From 4f267a055b9ced734a8843e6cb145509acf811b5 Mon Sep 17 00:00:00 2001
From: Kaituo Li
Date: Fri, 22 Mar 2024 11:48:43 -0700
Subject: [PATCH] Squashed commit of the following:
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

commit 303ceb5b320f148dcd015ab6a11f870ae3e23125
Author: Joshua Palis
Date: Mon Mar 18 16:49:59 2024 -0700

Adding 2.13 release notes (#1170)

Signed-off-by: Joshua Palis

commit 1507dd4b4045bd8e8cdf505d07041ce304788ca0
Author: Tyler Ohlsen
Date: Wed Feb 21 15:59:34 2024 -0800

Inject NamedWriteableRegistry in AD node client (#1164)

Signed-off-by: Tyler Ohlsen

commit 5b85720d442d1b43001dde0c20248fd4a20a64b0
Author: Owais Kazi
Date: Tue Feb 13 10:20:20 2024 -0800

Fixed lucene url (#1158)

Signed-off-by: owaiskazi19

commit 290bdcd832a0cae6d039787d9f0af73ea6be2ec2
Author: Kaituo Li
Date: Tue Feb 6 16:14:49 2024 -0800

add 2.12 release notes (#1152)

Signed-off-by: Kaituo Li

commit 6a6eec347379c997f0f7acc88b027bc1ffdd8f54
Author: Tyler Ohlsen
Date: Tue Jan 30 18:20:36 2024 -0800

Add ser/deser to get AD transport action request (#1150)

Signed-off-by: Tyler Ohlsen

commit 475f0c36e5b6267aaeb101627b9c3ae1fcca3a88
Author: Owais Kazi
Date: Thu Jan 25 14:37:12 2024 -0800

Updated lucene snapshot url (#1146)

Signed-off-by: owaiskazi19

commit 7192d848d2923566a409901909f342a63695e1c2
Author: Jackie Han
Date: Thu Jan 25 09:48:37 2024 -0800

Remove default admin credentials (#1134)

* Remove default admin credentials
Signed-off-by: Jackie Han
* Replace default password with a placeholder in README files
Signed-off-by: Jackie Han
---------
Signed-off-by: Jackie Han

commit 97482c19b0eded34a7f435e589303a834c258aa3
Author: Jackie Han
Date: Wed Jan 24 09:39:55 2024 -0800

Require JDK version for java spotless check (#1129)

Signed-off-by: Jackie Han

commit 708e0462189677a217b35905c2feba5486c77cdc
Author: Andriy Redko
Date: Tue Jan 23 15:11:39 2024 -0500

Update to Jackson 2.16.1 (#1135)

Signed-off-by: Andriy Redko

commit 7c0ce4cb554a05230ecf5365b903219f357997bf
Author: Andriy Redko
Date: Tue Jan 16 17:51:37 2024 -0500

Update to Gradle 8.5 (#1131)

Signed-off-by: Andriy Redko

commit e5cbe93c0e9cadebe2a30cba83f99ae8a3f8fe57
Author: zane-neo
Date: Thu Jan 4 16:50:47 2024 +0800

add publishToMavenLocal task to fix zip plugin not published to maven local issue when running this script (#1121)

Signed-off-by: zane-neo

commit 106dc25b3ef36beae329dc7882bcdf9c23347fb5
Author: Tyler Ohlsen
Date: Wed Dec 27 16:33:25 2023 -0800

Refactor client's getDetectorProfile to use GetAnomalyDetectorTransportAction (#1124)

Signed-off-by: Tyler Ohlsen

commit 59b4ebecfa11cac977f49c999ce85637fb6fac58
Author: Tyler Ohlsen
Date: Wed Dec 27 11:56:00 2023 -0800

Add profile transport action to AD client (#1119)

Signed-off-by: Tyler Ohlsen

commit 29711f96dbb2c9abc1181b933a8b334eed63ca66
Author: Tyler Ohlsen
Date: Tue Dec 12 08:32:53 2023 -0800

Add an AD transport client (#1110)

Signed-off-by: Tyler Ohlsen

commit 353dcaeb05f8b75fa74d92e2ae8415ab018bc68e
Author: Daniel Widdis
Date: Wed Nov 22 13:59:53 2023 -0800

Fix build, update CVE-affected versions (#1102)

* Fix build, update CVE-affected versions
Signed-off-by: Daniel Widdis
* Spotless depends on CVE-impacted eclipse dependency, now needs JDK17+
Signed-off-by: Daniel Widdis
---------
Signed-off-by: Daniel Widdis

commit 4c6ba48bf9fb9234ad8bc0b9193f3c68409acfb9
Author: Peter Zhu
Date: Mon Oct 30 19:35:01 2023 -0400

Fix the bwc test version retrieval (#1093)

* Onboarding jenkins prod docker images to github actions
Signed-off-by: Peter Zhu
* Add temurin
Signed-off-by: Peter Zhu
* Add temurin
Signed-off-by: Peter Zhu
* Add more
Signed-off-by: Peter Zhu
* Add more
Signed-off-by: Peter Zhu
* Add more
Signed-off-by: Peter Zhu
---------
Signed-off-by: Peter Zhu
commit a1da3aa0eefc6671446d2eb3bc670bad6449d2f0
Author: Peter Zhu
Date: Mon Oct 30 19:14:31 2023 -0400

Onboarding jenkins prod docker images to github actions (#1092)

* Onboarding jenkins prod docker images to github actions
Signed-off-by: Peter Zhu
* Add temurin
Signed-off-by: Peter Zhu
* Add temurin
Signed-off-by: Peter Zhu
* Add more
Signed-off-by: Peter Zhu
---------
Signed-off-by: Peter Zhu

commit 040cd7e33f8e696ffe7748140c9b0ecc7475d513
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:43:02 2023 -0700

dependabot: bump net.bytebuddy:byte-buddy from 1.14.6 to 1.14.9 (#1081)

Bumps [net.bytebuddy:byte-buddy](https://github.com/raphw/byte-buddy) from 1.14.6 to 1.14.9.
- [Release notes](https://github.com/raphw/byte-buddy/releases)
- [Changelog](https://github.com/raphw/byte-buddy/blob/master/release-notes.md)
- [Commits](https://github.com/raphw/byte-buddy/compare/byte-buddy-1.14.6...byte-buddy-1.14.9)

---
updated-dependencies:
- dependency-name: net.bytebuddy:byte-buddy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit dd9c1521bc1946027245d65e39a7844cdf04a8b7
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:42:15 2023 -0700

dependabot: bump aws-actions/configure-aws-credentials (#1076)

Bumps [aws-actions/configure-aws-credentials](https://github.com/aws-actions/configure-aws-credentials) from 1.7.0 to 4.0.1.
- [Release notes](https://github.com/aws-actions/configure-aws-credentials/releases)
- [Changelog](https://github.com/aws-actions/configure-aws-credentials/blob/main/CHANGELOG.md)
- [Commits](https://github.com/aws-actions/configure-aws-credentials/compare/v1.7.0...v4.0.1)

---
updated-dependencies:
- dependency-name: aws-actions/configure-aws-credentials
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit d6e3fd7a69b1daea45d7a455335e55b6de118033
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:05:00 2023 -0700

dependabot: bump com.netflix.nebula.ospackage from 11.0.0 to 11.5.0 (#1066)

Bumps com.netflix.nebula.ospackage from 11.0.0 to 11.5.0.

---
updated-dependencies:
- dependency-name: com.netflix.nebula.ospackage
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 57eacf611ce44559e14aa37f2e51dfbb82d53cc2
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:04:19 2023 -0700

dependabot: bump org.apiguardian:apiguardian-api from 1.1.0 to 1.1.2 (#1058)

Bumps [org.apiguardian:apiguardian-api](https://github.com/apiguardian-team/apiguardian) from 1.1.0 to 1.1.2.
- [Release notes](https://github.com/apiguardian-team/apiguardian/releases)
- [Commits](https://github.com/apiguardian-team/apiguardian/compare/r1.1.0...r1.1.2)

---
updated-dependencies:
- dependency-name: org.apiguardian:apiguardian-api
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
commit 6b9faf2b71e5481be366b12bd6f16440c6c8ef89
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:04:01 2023 -0700

dependabot: bump org.junit.platform:junit-platform-launcher (#1056)

Bumps [org.junit.platform:junit-platform-launcher](https://github.com/junit-team/junit5) from 1.9.2 to 1.10.0.
- [Release notes](https://github.com/junit-team/junit5/releases)
- [Commits](https://github.com/junit-team/junit5/commits)

---
updated-dependencies:
- dependency-name: org.junit.platform:junit-platform-launcher
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 2a957f2086688b5ee76703406105797d31207cc5
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 12:03:11 2023 -0700

Bump urllib3 from 1.26.9 to 1.26.17 in /dataGeneration (#1068)

Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.9 to 1.26.17.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.9...1.26.17)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit a12e33214f9c138946975801cefa2bd39c7978a9
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 11:29:37 2023 -0700

dependabot: bump org.objenesis:objenesis from 3.0.1 to 3.3 (#1040)

Bumps [org.objenesis:objenesis](https://github.com/easymock/objenesis) from 3.0.1 to 3.3.
- [Release notes](https://github.com/easymock/objenesis/releases)
- [Commits](https://github.com/easymock/objenesis/compare/3.0.1...3.3)

---
updated-dependencies:
- dependency-name: org.objenesis:objenesis
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 5a4ba14a431fbc53f757416542cda25ef2fae58c
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue Oct 10 11:28:46 2023 -0700

dependabot: bump tibdex/github-app-token from 2.0.0 to 2.1.0 (#1065)

Bumps [tibdex/github-app-token](https://github.com/tibdex/github-app-token) from 2.0.0 to 2.1.0.
- [Release notes](https://github.com/tibdex/github-app-token/releases)
- [Commits](https://github.com/tibdex/github-app-token/compare/v2.0.0...v2.1.0)

---
updated-dependencies:
- dependency-name: tibdex/github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
commit 713d556d980ccb144eb6ae952cebc7b3963ed28c
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed Sep 13 15:05:44 2023 -0700

dependabot: bump actions/setup-java from 1 to 3 (#1035)

Bumps [actions/setup-java](https://github.com/actions/setup-java) from 1 to 3.
- [Release notes](https://github.com/actions/setup-java/releases)
- [Commits](https://github.com/actions/setup-java/compare/v1...v3)

---
updated-dependencies:
- dependency-name: actions/setup-java
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 574ac521653ede84e7b821bc3c06049f1054a6ad
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed Sep 13 10:53:02 2023 -0700

dependabot: bump tibdex/github-app-token from 1.5.0 to 2.0.0 (#1042)

Bumps [tibdex/github-app-token](https://github.com/tibdex/github-app-token) from 1.5.0 to 2.0.0.
- [Release notes](https://github.com/tibdex/github-app-token/releases)
- [Commits](https://github.com/tibdex/github-app-token/compare/v1.5.0...v2.0.0)

---
updated-dependencies:
- dependency-name: tibdex/github-app-token
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 65e207aef49a6872408ae02e4f49cdd9060c1ef0
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Wed Sep 13 10:52:40 2023 -0700

dependabot: bump org.apache.commons:commons-pool2 from 2.10.0 to 2.11.1 (#1045)

Bumps org.apache.commons:commons-pool2 from 2.10.0 to 2.11.1.

---
updated-dependencies:
- dependency-name: org.apache.commons:commons-pool2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit a4ff71926af39726d9b4584102706e3ea7f4fa05
Author: Craig Perkins
Date: Mon Sep 11 14:16:31 2023 -0400

React to Tracer changes in TransportService (#1034)

Signed-off-by: Craig Perkins

commit 05d0a3b905cae9d25f6d5d91f30f1b842b9a792f
Author: Craig Perkins
Date: Mon Sep 11 14:14:20 2023 -0400

Ensure integ tests run with security after plugin rename (#1023)

* Ensure integ tests run with security after plugin rename
Signed-off-by: Craig Perkins
* Rename to time-series-analytics
Signed-off-by: Craig Perkins
* Switch folder back
Signed-off-by: Craig Perkins
* Run integTest with -i
Signed-off-by: Craig Perkins
* Remove opensearch-anomaly-detection if installed
Signed-off-by: Craig Perkins
* Update password rules and change expected error msg
Signed-off-by: Craig Perkins
* Update password generation
Signed-off-by: Craig Perkins
* Fix indexOf condition
Signed-off-by: Craig Perkins
---------
Signed-off-by: Craig Perkins

commit 72210f014ab54bf4a5b567ebe9dabe13b66c6cbb
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:13:39 2023 -0700

dependabot: bump codecov/codecov-action from 1 to 3 (#1038)

Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 1 to 3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v1...v3)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
commit b77c3a7406e94424550ec1f22f084bc20533994f
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:13:14 2023 -0700

dependabot: bump com.google.guava:guava from 32.0.1-jre to 32.1.2-jre (#1037)

Bumps [com.google.guava:guava](https://github.com/google/guava) from 32.0.1-jre to 32.1.2-jre.
- [Release notes](https://github.com/google/guava/releases)
- [Commits](https://github.com/google/guava/commits)

---
updated-dependencies:
- dependency-name: com.google.guava:guava
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 18defcc16e81abe590e8f109d6df3c4a060f74ce
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:13:03 2023 -0700

dependabot: bump VachaShah/backport from 1.1.4 to 2.2.0 (#1039)

Bumps [VachaShah/backport](https://github.com/vachashah/backport) from 1.1.4 to 2.2.0.
- [Release notes](https://github.com/vachashah/backport/releases)
- [Changelog](https://github.com/VachaShah/backport/blob/main/CHANGELOG.md)
- [Commits](https://github.com/vachashah/backport/compare/v1.1.4...v2.2.0)

---
updated-dependencies:
- dependency-name: VachaShah/backport
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 726759f104117e1a28648a1524cdface49168b79
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:12:53 2023 -0700

dependabot: bump actions/checkout from 2 to 4 (#1041)

Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit f6ac01886a662264323d1e11a2199fb190148c80
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:12:38 2023 -0700

dependabot: bump org.jacoco:org.jacoco.ant from 0.8.5 to 0.8.10 (#1044)

Bumps [org.jacoco:org.jacoco.ant](https://github.com/jacoco/jacoco) from 0.8.5 to 0.8.10.
- [Release notes](https://github.com/jacoco/jacoco/releases)
- [Commits](https://github.com/jacoco/jacoco/compare/v0.8.5...v0.8.10)

---
updated-dependencies:
- dependency-name: org.jacoco:org.jacoco.ant
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
commit d30bcc394e6f281b6de504f102a0085ae56e78e4
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Mon Sep 11 11:12:22 2023 -0700

dependabot: bump org.junit.jupiter:junit-jupiter-api (#1043)

Bumps [org.junit.jupiter:junit-jupiter-api](https://github.com/junit-team/junit5) from 5.9.2 to 5.10.0.
- [Release notes](https://github.com/junit-team/junit5/releases)
- [Commits](https://github.com/junit-team/junit5/compare/r5.9.2...r5.10.0)

---
updated-dependencies:
- dependency-name: org.junit.jupiter:junit-jupiter-api
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 92256db92f9ed78cb1de9925297c9ff5cce6da74
Author: Craig Perkins
Date: Mon Sep 11 14:04:23 2023 -0400

Add dependabot.yml (#1026)

Signed-off-by: Craig Perkins

commit 23611f6d8ccdc9a4c41469ec0a4746290f976b4f
Author: Jackie Han
Date: Fri Sep 8 11:19:31 2023 -0700

Add 2.10 release notes (#1031)

Signed-off-by: Jackie Han

commit ee1db57058e08c36e4820637973f18016cf45015
Author: Owais Kazi
Date: Wed Sep 6 16:55:00 2023 -0700

Updates demo certs for integ tests (#1018)

Signed-off-by: Owais Kazi

commit 338d72eb692c81802a70fe42556cde70c01cad6a
Author: Kaituo Li
Date: Wed Sep 6 16:08:01 2023 -0700

refactor job response (#1017)

Signed-off-by: Kaituo Li

commit db456c2543a6f1ff6b77db10bd19952768d16182
Author: Jackie Han
Date: Tue Sep 5 15:47:50 2023 -0700

upgrading commons-lang3 version to fix conflict issue (#1012)

* force commons-lang3 version to fix conflict issue
Signed-off-by: Jackie Han
* getting commons-lang3 version from upstream version properties file
Signed-off-by: Jackie Han
* getting commons-lang3 version from core
Signed-off-by: Jackie Han
* getting commons-lang3 version from core
Signed-off-by: Jackie Han
* upgrading commons-lang3 version to align with core
Signed-off-by: Jackie Han
---------
Signed-off-by: Jackie Han

commit db788f4714233121eba4238869b8db89ae0037d4
Author: Owais Kazi
Date: Tue Sep 5 13:58:31 2023 -0700

Adds auto release workflow (#1003)

Signed-off-by: owaiskazi19

commit 327c141e9b70465b1760f9ebf8c001a2e439b01f
Author: Jackie Han
Date: Fri Sep 1 13:00:23 2023 -0700

Revert "Enforce DOCUMENT Replication for AD Indices and Adjust Primary Shards (#948)" (#999)

This reverts commit bc1649922ef39619424fc16c64af5836e48a4a12.

commit 4a09b741e1c127a92775cb55f4c56aa5c4f3f6a3
Author: Kaituo Li
Date: Mon Aug 21 11:21:36 2023 -0700

Update Gradle Wrapper to 8.2.1 for Enhanced JDK20 Compatibility (#985)

- **Reason for Upgrade**: The migration to JDK20 ([reference](https://github.com/opensearch-project/opensearch-build/blob/aa65a8ecd69f77c3d3104043dd1c48dff708bffa/manifests/3.0.0/opensearch-3.0.0.yml#L9)) rendered the current Gradle version (7.6.1) incompatible.
- **Actions Taken**:
  - **Gradle Wrapper Update**: Upgraded the Gradle wrapper to version 8.2.1 to maintain compatibility with JDK20. The gradle wrapper files are generated using the `./gradlew wrapper` command.
  - Applied `spotless` due to new formatting requirements in Gradle 8.
  - Resolved test "jar hell" issues. Gradle 8 introduced internal JARs to the test classpath that conflicted with dependencies from `org.junit.vintage:junit-vintage-engine`. As a remedy, these conflicting JARs have been excluded.
- **Relevant Pull Requests**:
  - [Alerting#893](https://github.com/opensearch-project/alerting/pull/893/files)
  - [ML-Commons#892](https://github.com/opensearch-project/ml-commons/pull/892)
  - [Security PR](https://github.com/opensearch-project/security/pull/2978)
- **Verification**: Successfully verified the changes using `gradle build`.

Signed-off-by: Kaituo Li

commit 5ac6390e7de643d8177f0a5833995032f3a279ad
Author: Kaituo Li
Date: Thu Aug 17 11:08:36 2023 -0700

Refactoring task cache manager for forecasting (#982)

* Refactoring Shared Functionality for Broader Task Compatibility

This PR extracts shared components from `ADTaskCacheManager` into `TaskCacheManager` to support the requirements of the forecasting feature.

**Renamings**:
- **Method-level in `ADTaskCacheManager`**:
  - `addDeletedDetector` to `addDeletedConfig`
  - `addDeletedDetectorTask` to `addDeletedTask`
  - `hasDeletedDetectorTask` to `hasDeletedTask`
  - `pollDeletedDetector` to `pollDeletedConfig`
  - `pollDeletedDetectorTask` to `pollDeletedTask`
- **Variable-level in `AnomalyDetectorSettings`**:
  - `CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT` to `AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT`
  - `CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT` to `AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT`
  - `CHECKPOINT_SAVING_FREQ` to `AD_CHECKPOINT_SAVING_FREQ`
  - `CHECKPOINT_TTL` to `AD_CHECKPOINT_TTL`
  - `CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT` to `AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT`
  - `COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT` to `AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT`
  - `DEDICATED_CACHE_SIZE` to `AD_DEDICATED_CACHE_SIZE`
  - `ENTITY_COLD_START_QUEUE_CONCURRENCY` to `AD_ENTITY_COLD_START_QUEUE_CONCURRENCY`
  - `ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT` to `AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT`
  - `EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS` to `AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS`
  - `FILTER_BY_BACKEND_ROLES` to `AD_FILTER_BY_BACKEND_ROLES`
  - `MAX_ENTITIES_PER_QUERY` to `AD_MAX_ENTITIES_PER_QUERY`
  - `MAX_MODEL_SIZE_PER_NODE` to `AD_MAX_MODEL_SIZE_PER_NODE`
  - `MAX_MULTI_ENTITY_ANOMALY_DETECTORS` to `AD_MAX_HC_ANOMALY_DETECTORS`
  - `MAX_RETRY_FOR_END_RUN_EXCEPTION` to `AD_MAX_RETRY_FOR_END_RUN_EXCEPTION`
  - `MAX_SINGLE_ENTITY_ANOMALY_DETECTORS` to `AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS`
  - `MODEL_MAX_SIZE_PERCENTAGE` to `AD_MODEL_MAX_SIZE_PERCENTAGE`
  - `PAGE_SIZE` to `AD_PAGE_SIZE`
  - `RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT` to `AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT`
- **Class-level**:
  - `ADRealtimeTaskCache` renamed to `RealtimeTaskCache`
- **Package-level**:
  - Shifted from `org.opensearch.ad.breaker` to `org.opensearch.timeseries.breaker`

**Migrations**:
- Variables transferred from `AnomalyDetectorSettings` to `TimeSeriesSettings`:
  - `BATCH_BOUNDING_BOX_CACHE_RATIO`
  - `CHECKPOINT_MAINTAIN_REQUEST_SIZE_IN_BYTES`
  - `CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES`
  - `DEFAULT_AD_JOB_LOC_DURATION_SECONDS` (renamed to `DEFAULT_JOB_LOC_DURATION_SECONDS`)
  - `ENTITY_REQUEST_SIZE_IN_BYTES`
  - `HOURLY_MAINTENANCE`
  - `INTERVAL_RATIO_FOR_REQUESTS`
  - `LOW_SEGMENT_PRUNE_RATIO`
  - `MAINTENANCE_FREQ_CONSTANT`
  - `MAX_COLD_START_ROUNDS`
  - `MAX_QUEUED_TASKS_RATIO`
  - `MAX_TOTAL_RCF_SERIALIZATION_BUFFERS`
  - `MAX_CHECKPOINT_BYTES`
  - `MEDIUM_SEGMENT_PRUNE_RATIO`
  - `NUM_MIN_SAMPLES`
  - `NUM_SAMPLES_PER_TREE`
  - `NUM_TREES`
  - `QUEUE_MAINTENANCE`
  - `RESULT_WRITE_QUEUE_SIZE_IN_BYTES`
  - `SERIALIZATION_BUFFER_BYTES`
  - `THRESHOLD_MIN_PVALUE`
  - `TIME_DECAY`

**Deletions**:
- Obsolete settings and methods:
  - `DESIRED_MODEL_SIZE_PERCENTAGE` in `AnomalyDetectorSettings`
  - `getDesiredModelSize` in `MemoryTracker`

**Modifications**:
- `MemoryTracker` enums renamed for clear differentiation between real-time and historical memory usage, adding `REAL_TIME_FORECASTER` for a harmonized single-stream and HC analysis approach.

**Tests**:
- Changes validated with a successful Gradle build.

Signed-off-by: Kaituo Li

* improve comments
Signed-off-by: Kaituo Li
* improve comments
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit 0c5b4b94cb5b192160463fa8a5f58eac587fe95d
Author: Kaituo Li
Date: Tue Aug 15 13:09:17 2023 -0700

Update RCF to v3.8 and Enable Auto AD with 'Alert Once' Option (#979)

* Update RCF to v3.8 and Enable Auto AD with 'Alert Once' Option

This PR added support for automatic Anomaly Detection (AD) and the 'Alert Once' option introduced in RCF 3.8.

Testing done:
1. Deserialization Test:
   * Verified model deserialization from 3.0-rc3.
   * Ensured consistent scoring using the rc3 checkpoint and rc3 dependency on identical data.
2. Backward Compatibility Test:
   * Executed a mixed cluster with versions 2.10 and 3.0.
   * Validated that older detectors still produce results without throwing any exceptions in a blue-green simulation scenario.

Signed-off-by: Kaituo Li

* reduce recall since alertOnce reduced recall
Signed-off-by: Kaituo Li
* remove commented out code
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit cadc9bbd949daf4f2669335bfb8629e2137c108d
Author: Joshua Palis
Date: Fri Aug 11 16:02:38 2023 -0700

Fixing imports, adding TermsAggregator BucketCountThresholds for StringTerms, setting AnomalyResult grade to 0 if negative value (#976)

Signed-off-by: Joshua Palis

commit 3985cd2f82f2cfa57604ecf5a512bb042e632cae
Author: Romain Tartière
Date: Wed Aug 9 08:59:30 2023 -1000

Fix build with latest OpenSearch (for real this time) (#972)

This is a follow-up to #971, where commit 2839bc2c6f470639b68901086c80cb2152d96558 only fixed _some_ but not _all_ of the build errors (it allowed `./gradlew run` to pass but `./gradlew build` was failing). Fix the build errors when we run `./gradlew build`.

Signed-off-by: Romain Tartière

commit 2839bc2c6f470639b68901086c80cb2152d96558
Author: Romain Tartière
Date: Tue Aug 8 13:15:04 2023 -1000

Fix build with latest OpenSearch (#971)

OpenSearch recently merged some code that caused build failures for this module, as spotted in #585. This changes the import to match the new one after https://github.com/opensearch-project/OpenSearch/pull/9103

Signed-off-by: Romain Tartière

commit d4946f04e7ad44f0bc79215962c6678786c592ff
Author: Kaituo Li
Date: Mon Aug 7 14:21:56 2023 -0700

Refactor ADTask and Related Components (#969)

* Refactor ADTask and Related Components

This PR includes several key refactoring changes:
- Extracts common code from ADTask into TimeSeriesTask, creating ForecastTask for forecasting-specific logic.
- Consolidates common code from ADTaskType into TaskType and introduces ForecastTaskType for forecasting-related purposes.
- Renames ADTaskState to TaskState for consistent code reuse.
- Renames the method getId in ADTask to getConfigId to differentiate it from other IDs like task id.

Testing done:
1. Added unit tests for the new code to ensure functionality.
2. Executed a successful Gradle build.
Signed-off-by: Kaituo Li

* add comments and address compiler errors
Signed-off-by: Kaituo Li
* address Amit's comments and address compiler failure
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit 1130a1ba6fc3c08f375dd99f98e93e4ee36331fb
Author: Kaituo Li
Date: Thu Jul 27 10:39:39 2023 -0700

Refactoring NodeStateManager etc. to support forecasting functionality (#965)

* Refactoring NodeStateManager etc. to support forecasting functionality

This commit extends the codebase to support both Anomaly Detection (AD) and forecasting. It contains a mixture of refactoring, renaming, removal of unused code, and package moving tasks. Here are the details:

Refactoring:
- `NodeStateManager.getAnomalyDetector` is now `getConfig`, with added functionality to fetch a Forecaster. The method comments are updated for clarity.
- Existing methods (`getFeatureSamplesForPeriods`, `getColdStartSamplesForPeriods`, `createPreviewSearchRequest`, `getMinDataTime`) have been added in `SearchFeatureDao` to handle forecasting logic.
- Adjusted `SecurityClientUtil` and `ParseUtils` to handle forecasting logic.
- Cleaned up `NodeState` to differentiate state for AD and forecasting.

Renaming:
- `AnomalyDetectorJob` is renamed to `Job` to facilitate reuse for forecasting.
- `NodeStateManager.getAnomalyDetectorJob` is renamed to `getJob`.
- Certain settings in `AnomalyDetectorSettings` are renamed to reflect they are meant for the AD setting. They have been marked as deprecated and new settings are used in `TimeSeriesSettings` instead.
- `IndexAnomalyDetectorJobActionHandler.getAnomalyDetectorJobForWrite` is renamed to `getJobForWrite`.
- `ADSafeSecurityInjector` is renamed to `TimeSeriesSafeSecurityInjector`.

Removing unused code:
- Synchronous code in `ClientUtil`, `IndexUtils`, and `CheckpointDao` is removed.
- The unused class `Throttler` is deleted.
- Mapping file names are changed, and the code referencing these files is adjusted.

Package moving:
- Several classes (`ClientUtil`, `MultiResponsesDelegateActionListener`, `SafeSecurityInjector`, `SecurityUtil`, `ExceptionUtil`, `SearchFeatureDao`, `CleanState`, `ExpiringState`, `MaintenanceState`, `NodeState`, `SingleStreamModelIdMapper`, `BackPressureRouting`) are moved to the respective `org.opensearch.timeseries` packages.

Miscellaneous:
- Fixed compiler failures caused by changes in https://github.com/opensearch-project/OpenSearch/pull/8730 by replacing `DoubleArrayList` with `java.util.ArrayList`.
- Updates the Backwards Compatibility (bwc) version to align with the core's incremented bwc version as per [OpenSearch PR #8670](https://github.com/opensearch-project/OpenSearch/pull/8670). This change prevents the issue described in [OpenSearch Issue #5076](https://github.com/opensearch-project/OpenSearch/issues/5076).

Testing:
- Executed a `gradle build`.
- Added new tests for `ClientUtil` and `NodeStateManager`.

Signed-off-by: Kaituo Li

* improve comment
Signed-off-by: Kaituo Li
* fix compiler error and comments
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li
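The unified `getConfig` described above amounts to one lookup that switches on the analysis type. A minimal sketch of that shape, not the plugin's actual code; the index names and the `ActionListener` package below are assumptions (both vary across OpenSearch versions):

```java
import org.opensearch.action.get.GetRequest;
import org.opensearch.action.get.GetResponse;
import org.opensearch.client.Client;
import org.opensearch.core.action.ActionListener;

// Illustrative sketch only: loads either an AD detector config or a
// forecaster config document by id, switching on the analysis type.
public class ConfigReader {
    private final Client client;

    public ConfigReader(Client client) {
        this.client = client;
    }

    public void getConfig(String configId, boolean forecaster, ActionListener<GetResponse> listener) {
        String configIndex = forecaster
            ? ".opensearch-forecasters"         // assumed forecast config index name
            : ".opendistro-anomaly-detectors";  // AD config index name
        client.get(new GetRequest(configIndex, configId), listener);
    }
}
```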
commit f0ed43b1aa3c5d80dc6f23a5dd4465bb03ef1739
Author: Kaituo Li
Date: Mon Jul 17 15:18:03 2023 -0700

Rename AD to time series analytics & Resolve Compiler Errors (#951)

This commit focuses on renaming the AnomalyDetectorPlugin to TimeSeriesAnalyticsPlugin as well as addressing compiler errors due to package import changes. The primary transformations are:

* Renamed AnomalyDetectorPlugin to TimeSeriesAnalyticsPlugin.
* Updated all references from AnomalyDetectorPlugin to TimeSeriesAnalyticsPlugin.
* Amended plugin information in build.gradle and src/main/resources/es-plugin.properties.
* Switched job type in TimeSeriesAnalyticsPlugin from opendistro_anomaly_detector to opensearch_time_series_analytics.

These changes led to the modification of our plugin binary from opensearch-anomaly-detection-3.0.0.0-SNAPSHOT.jar to opensearch-time-series-analytics-3.0.0.0-SNAPSHOT.jar.

Moreover, following changes from OpenSearch Pull Request #8157, compiler errors were encountered, which were resolved by updating the import packages:

* org.opensearch.common.io.stream.NamedWriteableRegistry => org.opensearch.core.common.io.stream.NamedWriteableRegistry
* org.opensearch.common.xcontent.XContentParserUtils => org.opensearch.core.xcontent.XContentParserUtils
* org.opensearch.common.bytes.BytesArray => org.opensearch.core.common.bytes.BytesArray
* org.opensearch.common.io.stream.StreamInput => org.opensearch.core.common.io.stream.StreamInput
* org.opensearch.common.io.stream.StreamOutput => org.opensearch.core.common.io.stream.StreamOutput
* org.opensearch.common.io.stream.Writeable => org.opensearch.core.common.io.stream.Writeable
* org.opensearch.common.io.stream.NotSerializableExceptionWrapper => org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper
* org.opensearch.rest.RestStatus => org.opensearch.core.rest.RestStatus
* org.opensearch.common.ParsingException => org.opensearch.core.common.ParsingException
* org.opensearch.common.bytes.BytesReference => org.opensearch.core.common.bytes.BytesReference
* org.opensearch.common.io.stream.InputStreamStreamInput => org.opensearch.core.common.io.stream.InputStreamStreamInput
* org.opensearch.common.io.stream.OutputStreamStreamOutput => org.opensearch.core.common.io.stream.OutputStreamStreamOutput
* org.opensearch.index.shard.ShardId => org.opensearch.core.index.shard.ShardId
* org.opensearch.index.Index => org.opensearch.core.index.Index
* org.opensearch.core.index.IndexNotFoundException => org.opensearch.index.IndexNotFoundException

Validation steps were executed to ensure these changes did not break functionality:

* gradle build was successful. A local build was required due to outdated OpenSearch and job scheduler nightly builds.
* Backward compatibility tests were performed on a two-node 2.9 cluster running some detectors. These detectors continued to produce results on a 3.0 cluster with these updates.
* I can start new detectors on the 3.0 node.
* After shutting down a 2.9 node, the job transfer to the 3.0 node was successful and the job continued to perform as expected.

Signed-off-by: Kaituo Li

commit c7c6a46e88e15f615fc9dae0499cf0d674662057
Author: Tyler Ohlsen
Date: Thu Jul 13 15:42:46 2023 -0700

Add 2.9 release notes (#952)

Signed-off-by: Tyler Ohlsen

commit bc1649922ef39619424fc16c64af5836e48a4a12
Author: Kaituo Li
Date: Tue Jul 11 13:34:09 2023 -0700

Enforce DOCUMENT Replication for AD Indices and Adjust Primary Shards (#948)

* Enforce DOCUMENT Replication for AD Indices and Adjust Primary Shards

In this PR, we temporarily enforce DOCUMENT replication for AD indices. This change is necessary due to the current limitation of SegRep, which doesn't support Get/MultiGet by ID. This measure will be in place until SegRep adds support for these operations.

This adjustment aligns with the modification made in the referenced PR: job-scheduler PR #417

Additionally, this PR increases the number of primary shards for forecasting indices from 10 to 20, recognizing the higher write intensity of these indices. For instance, when the horizon is 24, we are expected to save 25 documents: 24 for each forecast and 1 for the actual value. In contrast, for AD, we save one document per actual value.

Testing done:
1. gradle build passed

Signed-off-by: Kaituo Li

* Fix comments
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li
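Forcing document replication comes down to one index setting at creation time. A minimal sketch, assuming the standard OpenSearch setting key `index.replication.type` (the shard count is the value quoted in the commit above; the class and method names are illustrative):

```java
import org.opensearch.action.admin.indices.create.CreateIndexRequest;
import org.opensearch.common.settings.Settings;

// Illustrative sketch: build a create-index request for a forecast result
// index with DOCUMENT replication enforced.
public class DocumentReplicationExample {
    public static CreateIndexRequest forecastResultIndexRequest(String indexName) {
        Settings settings = Settings.builder()
            .put("index.number_of_shards", 20)          // raised from 10 for the write-heavy forecast indices
            .put("index.replication.type", "DOCUMENT")  // SegRep lacks Get/MultiGet by ID, so force document replication
            .build();
        return new CreateIndexRequest(indexName).settings(settings);
    }
}
```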
commit a1203827617ee42b35d41cb8b010719f334f73b7
Author: Kaituo Li
Date: Mon Jul 10 10:53:17 2023 -0700

Add ForecastResult and Refactor Shared Code (#941)

This commit adds the functionality of ForecastResult, while also performing several necessary refactorings.

Firstly, we introduce ForecastResult, which closely mirrors the functionality of AnomalyResult. Any shared code between these two classes has been abstracted out into a new parent class, IndexableResult. This refactoring was undertaken to facilitate code reuse and improve maintainability, as both classes deal with storing fields that need to be saved in result indices.

Secondly, we've also extracted common code from ThresholdingResult into another new class, IntermediateResult. This work is in preparation for a forthcoming PR which will introduce RCFCasterResult.

In this change, we moved the 'toAnomalyResult' method to IndexableResult, and renamed it to 'toIndexableResults'. This renaming allows for reuse in the upcoming RCFCasterResult. The updated method now returns a list of results instead of a single result. The shift caters for the RCFCasterResult use case where one RCFCasterResult is stored across multiple ForecastResult documents.

Furthermore, this commit modifies the 'entity' field in several methods to be of the Optional type, indicating that it can be empty. This change provides a clear signal to other developers about the optional nature of these fields.

In addition, this commit updates the mapping for the forecast result index. The prior mapping was a placeholder and this change brings it up to date.

We've also moved to using httpcore5 and httpclient5 versions from OpenSearch core, as necessitated by recent changes in [OpenSearch PR #8434](https://github.com/opensearch-project/OpenSearch/pull/8434/files). This shift resolves potential jar hell issues with httpcore5 and httpclient5.

Lastly, we've made name and package changes:
1. Renamed 'initDetectionStateIndex' in ADIndexManagement to 'initStateIndex' to align with ForecastIndexManagement naming.
2. Moved the 'DataByFeatureId' class to the 'timeseries' package for better organization.

This commit also completes the 'createDummyIndexRequest' method in ForecastIndexManagement by invoking 'ForecastResult.getDummyResult()', which is now possible due to the introduction of the ForecastResult implementation.

Testing done:
1. added tests for new code
2. gradle build passes
3. e2e tests

Signed-off-by: Kaituo Li
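To make the fan-out concrete: a run with horizon h is persisted as h + 1 documents (one per forecasted step plus one for the actual value), which is why horizon 24 yields 25 documents. A minimal sketch with a stand-in record rather than the plugin's real ForecastResult class:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the toIndexableResults fan-out; Doc is a
// hypothetical stand-in for ForecastResult.
public final class ForecastFanOut {
    record Doc(String configId, long dataTime, Double actual, Double forecast, int step) {}

    static List<Doc> toIndexableResults(String configId, long dataTime, double actual,
                                        double[] forecasts, long intervalMillis) {
        List<Doc> docs = new ArrayList<>(forecasts.length + 1);
        docs.add(new Doc(configId, dataTime, actual, null, 0));  // one doc for the actual value
        for (int i = 0; i < forecasts.length; i++) {             // one doc per forecasted step
            docs.add(new Doc(configId, dataTime + (i + 1) * intervalMillis, null, forecasts[i], i + 1));
        }
        return docs;                                             // horizon 24 -> 25 documents
    }
}
```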
commit f24d9e32ff739abf29cac3da15fd00f74219e213
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Date: Thu Jul 6 17:28:09 2023 -0700

Bump scipy from 1.8.0 to 1.10.0 in /dataGeneration (#943)

Bumps [scipy](https://github.com/scipy/scipy) from 1.8.0 to 1.10.0.
- [Release notes](https://github.com/scipy/scipy/releases)
- [Commits](https://github.com/scipy/scipy/compare/v1.8.0...v1.10.0)

---
updated-dependencies:
- dependency-name: scipy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

commit 9b9fd2ba3f3e678d717c0a45af8069bb67828808
Author: Amit Galitzky
Date: Tue Jun 27 16:08:55 2023 -0700

bump guava version to 32.0.1 (#933)

Signed-off-by: Amit Galitzky

commit 99749dfa6bfc8a2f2c4660ca01d75d1ad57124a5
Author: Kaituo Li
Date: Mon Jun 26 16:46:24 2023 -0700

Refactoring Index Creation for Improved Code Reuse and Consistency (#932)

This Pull Request significantly refactors the index creation module to promote greater code reusability and consistency. The primary changes are outlined as follows:

1. Code Migration: Common code segments have been moved from AnomalyDetectionIndices to IndexManagement. In addition, component-specific code has been established in ADIndexManagement and ForecastIndexManagement. The bulk of changes are concentrated in these three classes, with the remainder of the modifications relating to reference/test changes.
2. Function Renaming: Several functions have been renamed for broader applicability and consistency:
   * AnomalyDetectorFunction has been renamed to ExecutorFunction for potential use in Forecasting.
   * AnomalyDetectorSettings.MAX_PRIMARY_SHARDS has been renamed to AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS in light of the new ForecastSettings.FORECAST_MAX_PRIMARY_SHARDS.
   * doesAnomalyDetectorJobIndexExist() has been renamed to doesJobIndexExist() to allow for job index reusability across Anomaly Detection (AD) and forecasting. Analogous changes have been made to other job index-related functions.
   * doesAnomalyDetectorIndexExist() has been renamed to doesConfigIndexExist() to allow for config index reusability across AD and forecasting. Analogous changes have been made to other config index-related functions.
   * detectionIndices.doesDetectorStateIndexExist() has been renamed to detectionIndices.doesStateIndexExist(), as the former name was unnecessarily redundant. Similar modifications have been made to the result and checkpoint index.
3. Class Migration: The classes ThrowingSupplierWrapper, ThrowingConsumer, and ThrowingSupplier have been moved from org.opensearch.ad.util to org.opensearch.timeseries.function to promote code reuse.
4. Initial Forecast Indices' Mapping: An initial version of forecast indices' mapping has been added, which can be adjusted as required in the future.
5. In terms of version updates, 2.x has been bumped to 2.9, prompting an increment of the Backward Compatibility (BWC) test version to 2.9.
6. Update dependency com.google.guava:guava to v32 for cve fix.

Testing done: Testing has been performed with new tests added for forecast index creation, and the gradle build passes successfully.

Signed-off-by: Kaituo Li

commit aedb781a6b1df984636d72d50b65d41138539918
Author: Craig Perkins
Date: Tue Jun 20 16:55:19 2023 -0400

Fix main build - update import of Releasable and remove reference to BaseExceptionsHelper (#930)

* Fix main build - update import of Releasable and remove reference to BaseExceptionsHelper
Signed-off-by: Craig Perkins
* Update imports in test classes
Signed-off-by: Craig Perkins
---------
Signed-off-by: Craig Perkins

commit fce78c29c0375ae06f56e16c85fb98933401e28f
Author: Kaituo Li
Date: Wed Jun 14 10:36:40 2023 -0700

Refactor: Reorganize code for Forecasting and AnomalyDetector packages (#925)

* Refactor: Reorganize code for Forecasting and AnomalyDetector packages

This PR includes several refactorings to increase code reuse and improve readability:

1. Entity, FeatureData, and RestHandlerUtils have been moved from the 'ad' package to the 'timeseries' package, enabling their use within the Forecasting code.
2. A new class, DataByFeatureId, has been created to hold shared code previously located in FeatureData. FeatureData now inherits from DataByFeatureId. The standalone usage of DataByFeatureId will be detailed in subsequent PRs.
3. Renamed checkAnomalyDetectorFeaturesSyntax to checkFeaturesSyntax in RestHandlerUtils to make it more generic and usable for Forecasting.
4. Constants AGG_NAME_MAX_TIME, AGG_NAME_MIN_TIME, DATE_HISTOGRAM, and FEATURE_AGGS have been moved from ADCommonName to CommonName to make them accessible for Forecasting.
5. A new method, parseConfig, has been added to the Config class. This method can parse configuration for either AnomalyDetector or Forecaster based on input.

Testing: The changes have been validated with a successful gradle build.

Signed-off-by: Kaituo Li

* address Amit and Owais's comments
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li
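The parseConfig idea can be sketched as a type-dispatching factory. The generic signature below is an assumption for illustration (see the plugin's Config class for the actual method), and the package paths of the two model classes reflect the post-refactor layout:

```java
import java.io.IOException;

import org.opensearch.ad.model.AnomalyDetector;     // assumed package path
import org.opensearch.core.xcontent.XContentParser;
import org.opensearch.forecast.model.Forecaster;    // assumed package path

// Illustrative sketch: one entry point that yields either an
// AnomalyDetector or a Forecaster depending on the requested type.
public class ConfigParsing {
    @SuppressWarnings("unchecked")
    public static <T> T parseConfig(Class<T> configType, XContentParser parser) throws IOException {
        if (configType == AnomalyDetector.class) {
            return (T) AnomalyDetector.parse(parser);
        }
        if (configType == Forecaster.class) {
            return (T) Forecaster.parse(parser);
        }
        throw new IllegalArgumentException("Unsupported config type: " + configType);
    }
}
```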
commit 02c120f025f73f3d45a8477c7ef8b4873befebfa
Author: Kaituo Li
Date: Mon Jun 12 11:54:39 2023 -0700

Updated Maintainers and CODE_OWNERS list (#926)

Signed-off-by: Kaituo Li

commit 2ea4dcf430b40a710a0babb04881eeff9fb2d689
Author: Kaituo Li
Date: Thu Jun 8 15:11:44 2023 -0700

Add Forecaster class (#920)

* Add Forecaster class

This PR adds class Forecaster that serves as the configuration POJO for forecasting. Shared code between AnomalyDetector and Forecaster is extracted and moved to the Config class to reduce duplication and promote reusability. References to the common code in related classes have also been adjusted.

Testing done:
1. gradle build.

Signed-off-by: Kaituo Li

* fix compiler error due to a recent core change
Signed-off-by: Kaituo Li
* address Sudipto and Amit's comments
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit ee04225349a5f0f003d56e77a3243386f4a9673d
Author: Kaituo Li
Date: Fri Jun 2 15:42:19 2023 -0700

Refactor: Migrate files from 'ad' to 'timeseries' package (#918)

* Refactor: Migrate files from 'ad' to 'timeseries' package

This commit involves moving various files from the 'ad' package to the 'timeseries' package. All changes to dependent imports were handled automatically through the IDE, and no changes to file contents were made in the process.

Testing done:
- gradle build

Signed-off-by: Kaituo Li

* move ParseUtils to util package
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit 472868d165c2234eece8827584f64086aaf7ae92
Author: Kaituo Li
Date: Thu Jun 1 11:17:16 2023 -0700

Refactor Shared Settings, Introduce Dynamic Forecast Settings, Rename (#912)

* Refactor Shared Settings, Introduce Dynamic Forecast Settings, and Rename Checkpoint mapping File

This commit has undertaken three changes:

* Refactoring of Shared Settings: I've performed a refactoring to unify the settings that are shared between the AD and forecasting modules.
* Introduction of Dynamic Forecast Settings: To increase the adaptability of our system, I've implemented dynamic forecast settings. These settings mirror the existing structure of the AD dynamic settings and will enable us to adjust forecast settings on-the-fly.
* Renaming of Checkpoint File: To enhance the consistency across our AD mapping files, I've renamed checkpoint.json to anomaly-checkpoint.json.

Testing done:
1. added tests for new settings.
Signed-off-by: Kaituo Li

* name change and delete files
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit 04ab91e60e9d71765a7b9df10b86e0e53ba4d715
Author: Kaituo Li
Date: Wed May 31 12:18:54 2023 -0700

add 2.8 release notes (#915)

Signed-off-by: Kaituo Li

commit 98da8dffe8d336793ae35658015650b0f83aeef1
Author: Kaituo Li
Date: Wed May 24 14:37:38 2023 -0700

Implementing Multiple Imputation Methods for Missing Values (#907)

* Implementing Multiple Imputation Methods for Missing Values

This Pull Request introduces multiple imputation methods to handle missing data in Anomaly Detection (AD) and forecasting. The ability to handle missing data is crucial to improve the robustness and accuracy of our models. The following imputation methods have been implemented:

* Zero Imputation (ZERO): This method replaces all missing values with 0's. It's a simple approach, but it may introduce bias if the data is not centered around zero.
* Fixed Values Imputation (FIXED_VALUES): This method replaces missing values with a predefined set of values. The values are the same for each input dimension, and they need to be specified by the user.
* Previous Value Imputation (PREVIOUS): This method replaces missing values with the last known value in the respective input dimension. It's a commonly used method for time series data, where temporal continuity is expected.
* Linear Interpolation (LINEAR): This method estimates missing values by interpolating linearly between known values in the respective input dimension. This method assumes that the data follows a linear trend.

These methods are designed to provide a set of options for users to handle missing data based on their specific needs and the nature of their data.

Testing Done: The code changes in this pull request have been validated through a gradle build to ensure that all new and existing tests pass successfully.

Signed-off-by: Kaituo Li

* Various changes

The commit makes all new files use the shortened header text and addresses compiler/test issues due to core refactoring

Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li
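The four strategies are simple enough to sketch for one feature dimension. A minimal, illustrative Java version (class and method names are hypothetical, not the plugin's API; missing points are marked with NaN):

```java
import java.util.Arrays;

// Illustrative sketch of the four imputation methods described above.
public final class ImputationSketch {

    enum Method { ZERO, FIXED_VALUES, PREVIOUS, LINEAR }

    /** Fills Double.NaN entries of a single feature dimension in one left-to-right pass. */
    static double[] impute(double[] series, Method method, double fixedValue) {
        double[] out = Arrays.copyOf(series, series.length);
        for (int i = 0; i < out.length; i++) {
            if (!Double.isNaN(out[i])) {
                continue;
            }
            switch (method) {
                case ZERO -> out[i] = 0d;                              // replace with 0
                case FIXED_VALUES -> out[i] = fixedValue;              // user-specified constant
                case PREVIOUS -> out[i] = i > 0 ? out[i - 1] : 0d;     // last known value; no earlier point -> 0
                case LINEAR -> out[i] = interpolate(out, i);           // between nearest known neighbors
            }
        }
        return out;
    }

    // Linear interpolation between the nearest non-missing neighbors of index i
    // (left neighbors may already have been filled earlier in the same pass).
    private static double interpolate(double[] v, int i) {
        int lo = i - 1;
        while (lo >= 0 && Double.isNaN(v[lo])) lo--;
        int hi = i + 1;
        while (hi < v.length && Double.isNaN(v[hi])) hi++;
        if (lo < 0 && hi >= v.length) return 0d; // nothing known at all
        if (lo < 0) return v[hi];                // no left neighbor: extend right value
        if (hi >= v.length) return v[lo];        // no right neighbor: extend left value
        double t = (double) (i - lo) / (hi - lo);
        return v[lo] + t * (v[hi] - v[lo]);
    }
}
```

For example, `impute(new double[] {1, NaN, 3}, Method.LINEAR, 0)` would yield `{1, 2, 3}`, while PREVIOUS would yield `{1, 1, 3}`.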
commit da608aaa3d3f955efe32b0cb33b3c8cfdd7a7718
Author: Kaituo Li
Date: Tue May 16 11:37:34 2023 -0700

Code Refactoring for CommonMessages (#902)

* Code Refactoring for CommonMessages

In this pull request, I have refactored the code related to shared log or error messages in both AD and forecasting modules to CommonMessages. Additionally, the previously used CommonErrorMessages has been renamed to ADCommonMessages. For the Forecasting module, I have introduced new names in ForecastCommonMessages.

Testing done:
* gradle build

Signed-off-by: Kaituo Li

* Update string constants wording
Signed-off-by: Kaituo Li
* improve wording and remove redundant variables
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit 041d6ce73d37f1697013b2d2f0683fbfe012247c
Author: Kaituo Li
Date: Fri May 12 13:05:56 2023 -0700

Code Refactoring for CommonName (#901)

In this pull request, I have refactored the code related to shared names in both AD and forecasting modules to CommonNames. Additionally, the previously used CommonName has been renamed to ADCommonName. For the Forecasting module, I have introduced new names in ForecastCommonNames.

Testing done:
* gradle build

Signed-off-by: Kaituo Li

commit e2fee9d6ba59cc2a23eadbb50a80414fcc7dbf8f
Author: Kaituo Li
Date: Wed May 10 17:26:31 2023 -0700

Update outdated MAINTAINERS and OWNERS list (#898)

Signed-off-by: Kaituo Li

commit 98b566d1ec57ffd8970b5180c54b493deb1a24ac
Author: Daniel (dB.) Doubrovkine
Date: Wed May 10 18:03:04 2023 -0400

Sync up CODEOWNERS with MAINTAINERS.md. (#897)

Signed-off-by: dblock

commit 1f8415aa39fa4ec3ba016214255aa0e79b7c7d63
Author: Kaituo Li
Date: Tue May 9 12:55:35 2023 -0700

Various fixes (#886)

* Various fixes

This pull request addresses several issues related to the compiler and tests in the current main branch.

The first major change involves replacing ImmutableOpenMap with java.util.Map in the core. This modification is implemented in the following pull requests:
https://github.com/opensearch-project/OpenSearch/pull/7165
https://github.com/opensearch-project/OpenSearch/pull/7301

To accommodate this change, the codebase of the AD (Anomaly Detection) module has been refactored to utilize JDK maps. As a consequence of this change, passing null to the custom parameters of ClusterState is no longer permissible, as it leads to a NullPointerException. The error stack trace is as follows:

java.lang.NullPointerException: Cannot invoke "Object.getClass()" because "m" is null
    at __randomizedtesting.SeedInfo.seed([60CDDB34427ACD0C:6E72DB4ED18E018D]:0)
    at java.base/java.util.Collections.unmodifiableMap(Collections.java:1476)
    at org.opensearch.cluster.ClusterState.<init>(ClusterState.java:219)
    at org.opensearch.ad.transport.DeleteAnomalyDetectorTests.createClusterState(DeleteAnomalyDetectorTests.java:216)
    at org.opensearch.ad.transport.DeleteAnomalyDetectorTests.testDeleteADTransportAction_LatestDetectorLevelTask(DeleteAnomalyDetectorTests.java:160)

To address this issue, we have replaced the usage of null with new HashMap<>().

The second change in this pull request aligns with the modifications introduced in PR https://github.com/opensearch-project/anomaly-detection/pull/878.

The third issue is related to the incompatibility between tests that utilize the @Parameters annotation and those that do not, as explained in https://tinyurl.com/2y265s2w. Specifically, the SearchFeatureDaoTests class runs tests with the @Parameters annotation, whereas SearchFeatureDao tests do not.

Testing done:
1. gradle build.

Signed-off-by: Kaituo Li

commit 0a48da9e1a0cd70c64cc2da228674baaf377c89e
Author: Jackie Han
Date: Mon Apr 17 12:44:06 2023 -0700

Add 2.7 release notes (#871)

Signed-off-by: Jackie Han

commit 1edc2ef1ccfe20938711984744b20d1f2ec96a3c
Author: Jackie Han
Date: Tue Apr 11 17:53:57 2023 -0700

Update OpenSearchRejectedExecutionException import (#861)

* Update OpenSearchRejectedExecutionException import
Signed-off-by: Jackie Han
* Update OpenSearchRejectedExecutionException import
Signed-off-by: Jackie Han
* Update OpenSearchRejectedExecutionException import
Signed-off-by: Jackie Han
* Update NodeInfo constructor and javadoc for isAdmin method
Signed-off-by: Jackie Han
---------
Signed-off-by: Jackie Han

commit b8df172c2364997c446096e960068d6fc5d2e6f6
Author: Amit Galitzky
Date: Tue Apr 11 10:55:28 2023 -0700

Giving admin priority over backendrole filtering (#850)

* giving admin priority over backend role filtering
Signed-off-by: Amit Galitzky
* fix security tests
Signed-off-by: Amit Galitzky
* remove redundant line in test case
Signed-off-by: Amit Galitzky
---------
Signed-off-by: Amit Galitzky
commit fd2fd5d5f63c9004b9f265e3c1e4475282447da8
Author: Daniel (dB.) Doubrovkine
Date: Thu Mar 16 14:30:32 2023 -0400

Created untriaged issue workflow. (#809)

Signed-off-by: dblock

commit fd547014fdde5114bbc9c8e49fe7aaa37eb6e793
Author: Kaituo Li
Date: Wed Mar 1 16:41:05 2023 -0800

Publish snapshots to maven via GHA, fix spotless errors, and bump bwc version (#827)

We are de-coupling the task of publishing the maven snapshots from centralized build workflow to individual repositories. This PR publishes maven snapshots using GitHub Actions.

Testing done:
1. ran ./gradlew publishPluginZipPublicationToSnapshotsRepository according to https://github.com/opensearch-project/anomaly-detection/issues/813

Signed-off-by: Kaituo Li

commit b6be5960fb8cff41f0e9d13d5f5a971c289b80ed
Author: Jackie Han
Date: Thu Feb 23 12:23:01 2023 -0800

Update xcontent imports (#822)

Signed-off-by: Jackie Han

commit 1dcb78a6c4d1331133aefe5af7adf137daa8846f
Author: Jackie Han
Date: Wed Feb 22 11:17:21 2023 -0800

Add 2.6 release note (#818)

Signed-off-by: Jackie Han

commit 553688aae45590c37cc4d86102642d0b3a6ebb33
Author: Kaituo Li
Date: Thu Feb 9 13:38:14 2023 -0800

Revert changes to exception message (#803)

* Revert changes to exception message

We used to log "No RCF models are available either because RCF models are not ready..." and now we log "resource_not_found_exception: No checkpoints found for model id...". This change caused an integration test failure. This PR reverts it back.

Testing done:
1. verified that the original log is back.

Signed-off-by: Kaituo Li

* check NO_MODEL_ERR_MSG instead of NO_CHECKPOINT_ERR_MSG
Signed-off-by: Kaituo Li
---------
Signed-off-by: Kaituo Li

commit c9bdc628a03c4ced3cd1ebeec44ebff792d7ff98
Author: Kaituo Li
Date: Fri Feb 3 11:15:37 2023 -0800

fixing dls/fls logic around numeric aggregations (#801)

Signed-off-by: Kaituo Li

commit 9cebc5bc97a84dd78ac8892863ff7d674428bced
Author: Amit Galitzky
Date: Tue Jan 10 02:17:04 2023 +0000

adding 2.5 release notes

Signed-off-by: Amit Galitzky

commit 2ace84643fd199a5e5179e6d21ab76ae212cb7d8
Author: Ryan Bogan <10944539+ryanbogan@users.noreply.github.com>
Date: Tue Jan 10 13:31:49 2023 -0800

Add jackson back to build.gradle (#780)

Signed-off-by: Ryan Bogan

commit a5d60eefab5ac30ef5243da8e7a88b7a2189cd5c
Author: Kaituo Li
Date: Mon Jan 9 15:49:35 2023 -0800

bump bwc to 2.6.0 (#775)

We need to bump the version because only the latest 2.x will be upgradable to 3.0 when that ships. Since OpenSearch core bumped to 2.6, we need to do that as well. Note after bumping version, bwc still fails to find 2.6.0 core zip.

Signed-off-by: Kaituo Li

commit 27ed49863bfbef90551cdfacc6b51703bf622e25
Author: Kaituo Li
Date: Mon Jan 9 12:17:57 2023 -0800

Fix the discrepancy between Profile API and real time tasks API (#770)

* Fix the discrepancy between Profile API and real time tasks API

This PR fixes the discrepancy by querying the result index when the total updates is less than 32. We have done similar things in profile API so I refactored reusable code to ProfileUtil. We also cached whether we have queried the result index and won't repeatedly issue the extra query.

Testing done:
1. repeated repro steps in https://github.com/opensearch-project/anomaly-detection/issues/502 and verified the issue has been resolved.

Signed-off-by: Kaituo Li
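The gating logic in that fix can be sketched in a few lines; the constant, class, and field names here are hypothetical, not the plugin's actual members:

```java
// Illustrative sketch: only fall back to the result index when RCF reports
// fewer than 32 total updates, and remember that the extra query was issued
// so it is not repeated.
public class RealtimeInitProgressCheck {
    private static final long FULLY_INITIALIZED_UPDATES = 32;
    private volatile boolean queriedResultIndex = false;

    public boolean needsResultIndexCheck(long totalUpdates) {
        if (totalUpdates >= FULLY_INITIALIZED_UPDATES || queriedResultIndex) {
            return false; // either initialized or already confirmed once
        }
        queriedResultIndex = true; // cache so we won't repeatedly issue the extra query
        return true;
    }
}
```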
commit b49a36b3e7962a205695d451e0497ee4f57fb2cb
Author: Daniel (dB.) Doubrovkine
Date: Thu Jan 5 10:19:51 2023 -0800

Updated MAINTAINERS.md to match recommended opensearch-project format. (#771)

Signed-off-by: dblock

commit 20d2c5f94811f62a7077142ba505bed272a83549
Author: Amit Galitzky
Date: Tue Dec 20 13:16:21 2022 -0800

update bwc to 2.5 (#765)

Signed-off-by: Amit Galitzky

commit e08314f43a22d2692e8467efae3699fce0440611
Author: Amit Galitzky
Date: Tue Dec 20 11:00:45 2022 -0800

[Forward-Port to main] Fix _source bug (#749) (#764)

* Fix _source bug (#749)
Signed-off-by: Amit Galitzky
* fixed strings method
Signed-off-by: Amit Galitzky

commit 134942ed61b984e57b1f9c6bc7d416232d8f59d2
Author: Kaituo Li
Date: Fri Dec 16 09:38:46 2022 -0800

Speed up cold start (#753)

* Speed up cold start

If historical data is enough, a single stream detector takes 1 interval for cold start to be triggered + 1 interval for the state document to be updated. Similar to single stream detectors, HCAD cold start needs 2 intervals and one more interval to make sure an entity appears more than once. So HCAD needs three intervals to complete cold starts. Long initialization is the single most complained-about problem of AD. This PR reduces both single stream and HCAD detectors' initialization time to 1 minute by:

* delaying real time cache update by one minute when we receive ResourceNotFoundException in single stream detectors or when the init progress of HCAD real time cache is 0. Thus, we can finish the cold start and writing checkpoint one minute later and update the state document accordingly. This optimization saves one interval to wait for the state document update.
* disabling the door keeper by default so that we won't have to wait an extra interval in HCAD.
* triggering cold start when starting a real time detector. This optimization saves one interval to wait for the cold start to be triggered.

Testing done:
* verified the cold start time is reduced to 1 minute.
* added tests for new code.

Signed-off-by: Kaituo Li

commit 609abe42df8af450e48bfa122746361f7a8201c7
Author: Daniel (dB.) Doubrovkine
Date: Thu Dec 15 17:11:10 2022 -0500

Fix: typo in ohltyler. (#760)

Signed-off-by: dblock

commit ce3d747317aec5ccef080c767d4ba8e7015c5a2a
Author: Varun Jain
Date: Tue Dec 6 15:44:33 2022 -0800

Model Profile Test (#748)

commit 09b4cc1ac3ab9fb4e589754acba619d265603580
Author: Amit Galitzky
Date: Mon Dec 5 12:06:16 2022 -0800

Use sonatype to fetch JS instead of ci.opensearch.org (#740)

Signed-off-by: Amit Galitzky

commit a3e2b994ed347263a8dbaf5beed6bd5ddd6931a4
Author: Kaituo Li
Date: Wed Nov 30 16:49:23 2022 -0800

AD model performance benchmark (#728)

* AD model performance benchmark

This PR adds an AD model performance benchmark so that we can compare model performance across versions. For the single stream detector, we refactored tests in DetectionResultEvalutationIT and moved it to SingleStreamModelPerfIT. For the HCAD detector, we randomly generated synthetic data with known anomalies inserted throughout the signal. In particular, these are one/two/four dimensional data where each dimension is a noisy cosine wave. Anomalies are inserted into one dimension with 0.003 probability. Anomalies across each dimension can be independent or dependent. We have approximately 5000 observations per data set. The data set is generated using the same random seed so the result is comparable across versions.

We also backported #600 so that we can capture the performance data in CI output. We also fixed #712 by revising the client setup code.

Testing done:
* added unit tests to run the benchmark.

Signed-off-by: Kaituo Li
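A single dimension of the synthetic series described above is easy to sketch. The amplitude, period, noise level, and spike size below are illustrative assumptions; the commit only fixes the anomaly rate (0.003) and the seeded-randomness requirement:

```java
import java.util.Random;

// Illustrative sketch: a noisy cosine wave with rare injected anomalies,
// generated from a fixed seed so runs are comparable across versions.
public final class SyntheticCosineData {
    public static double[] generate(int n, long seed) {
        Random rng = new Random(seed);
        double[] data = new double[n];
        for (int i = 0; i < n; i++) {
            double base = 100 * Math.cos(2 * Math.PI * i / 60.0); // underlying cosine signal
            double noise = 5 * rng.nextGaussian();                // additive Gaussian noise
            double anomaly = rng.nextDouble() < 0.003             // ~0.3% of points are anomalous
                ? 50 * (rng.nextBoolean() ? 1 : -1)               // injected spike, either direction
                : 0;
            data[i] = base + noise + anomaly;
        }
        return data;
    }
}
```

A call such as `generate(5000, 42L)` matches the roughly 5000-observation data sets the commit mentions.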
Signed-off-by: Kaituo Li commit d8f0c355c026bd60b991a055f5d35dc0c8f654bd Author: Owais Kazi Date: Wed Nov 30 09:53:05 2022 -0800 Added coalesceToEmpty method (#736) Signed-off-by: Owais Kazi commit 7cefb143d61178223eec0dc1aa5aa2a16534316e Author: Kaituo Li Date: Tue Nov 8 10:19:50 2022 -0800 remove bwc code related to 1.0 and earlier versions (#714) This PR addresses an AD compile failure caused by the core removing the 1.x Version constants (check https://github.com/opensearch-project/OpenSearch/pull/5021). OpenSearch does not support N-2 version compatibility; this is inherited from Elasticsearch and Lucene. So version 1.x is not compatible with 3.0, hence the removal of the deprecated 1.x code. This PR follows suit and removes backward compatibility code for OpenSearch 1.0 and older versions. So we won't support direct upgrades from 1.x domains to 3.x. Testing done: 1. gradle build. Note that the CI workflow will fail due to the missing job scheduler. We are using a job scheduler from the distribution. But due to a circular dependency (publishing -SNAPSHOT builds to maven requires the distribution build to be successful, check https://github.com/opensearch-project/opensearch-build/issues/1463), the AD compilation failure caused the job scheduler to be taken out. Not including the latest job scheduler will cause the AD build to fail. So we have a chicken-and-egg problem: this PR fixes the AD build failure, but the PR itself cannot build due to the missing job scheduler. To run gradle build, I changed to use a local job scheduler and verified that gradle build succeeded. Once I merge this PR, the job scheduler should be able to build and be published to maven. Future AD CI should be unblocked after that. Signed-off-by: Kaituo Li commit bcfe952f0bfced908ce1c5b9547d23bd77cbb9f4 Author: Amit Galitzky Date: Mon Nov 7 09:00:56 2022 -0800 Fixing Docker CI for security enabled tests (#710) Signed-off-by: Amit Galitzky commit 06e5edb68686bb8ca9dfb63f7a9c92b0b44bc295 Author: Amit Galitzky Date: Fri Oct 28 11:44:40 2022 -0700 windows CI for AD (#703) Signed-off-by: Amit Galitzky commit c43ddbfce5ba5ee179b082fbeacd0c12422a71cb Author: Jackie Han Date: Thu Oct 27 10:55:41 2022 -0700 Add 2.4 release notes (#699) Signed-off-by: Jackie Han commit 7a05826a9f26ebd1e7c9bc8c821e7034e453562d Author: Jackie Han Date: Fri Oct 21 15:04:02 2022 -0700 Fixing issues caused by 3.0 version bumping (#696) * Update BaseNodeRequest dependency Signed-off-by: Jackie Han * Migrate client transports to Apache HttpClient / Core 5.x Signed-off-by: Jackie Han * Address spotlessCheck failures Signed-off-by: Jackie Han * Upgrade bwc version because 1.1.0 cannot be upgraded directly to version 3.0.0 Signed-off-by: Jackie Han * update bwc version to 2.4 Signed-off-by: Jackie Han * Update bwc version Signed-off-by: Jackie Han * update bwc download link Signed-off-by: Jackie Han * use latest 2.4 build for bwc test Signed-off-by: Jackie Han * move bwc constants to build script block Signed-off-by: Jackie Han commit 24232aea08cf9217600b4ec7c00742b078c99a47 Author: Tyler Ohlsen Date: Fri Oct 21 13:32:02 2022 -0700 Bump jackson-databind to 2.13.4.2 (#697) Signed-off-by: Tyler Ohlsen commit e5e15a6f9acfa9588abeefc8a6666bbb5f602488 Author: Jackie Han Date: Mon Oct 17 15:46:55 2022 -0700 bump version to 3.0.0 (#693) Signed-off-by: Jackie Han commit 16cee09f557b4c65edfd8961b25f200e466a01de Author: Prudhvi Godithi Date: Thu Oct 13 09:26:16 2022 -0700 add group = org.opensearch.plugin (#690) Signed-off-by:
prudhvigodithi commit 27942cb37a73cd5f48764f14a72670bdda1c30a3 Author: Amit Galitzky Date: Mon Oct 3 15:16:38 2022 -0700 update jackson dependency version (#678) Signed-off-by: Amit Galitzky commit fd718445fc65934c8e2ff074194ff1f000cae952 Author: Amit Galitzky Date: Sun Sep 25 11:25:04 2022 -0700 Fix window delay test (#674) Signed-off-by: Amit Galitzky commit 426f713b8043c93494189271138e4c5b1ce4593f Author: Leonidas Spyropoulos Date: Thu Sep 22 19:48:35 2022 +0100 Add support for reproducible builds (#579) As per the gradle [docs], add support to remove timestamps and to package entries in the same order, which is required for [reproducible] builds [docs]: https://docs.gradle.org/current/userguide/working_with_files.html#sec:archives [reproducible]: https://reproducible-builds.org/ Signed-off-by: Leonidas Spyropoulos commit 397ebcc052110f9d3db1d312fc28b38c8530377e Author: Owais Kazi Date: Wed Sep 14 10:31:23 2022 -0700 Removed Github DCO action since DCO runs via Github App now (#664) Signed-off-by: Owais Kazi commit 19a6c84ad1b8048a571b7a1a1345e646f587508d Author: Tyler Ohlsen Date: Wed Sep 7 12:01:25 2022 -0700 Add 2.3 release notes (#660) * Add 2.3 release notes Signed-off-by: Tyler Ohlsen * Add other documentation PR Signed-off-by: Tyler Ohlsen commit 06d53fb28c12366ea29d88d2186d55c7295544c2 Author: Tyler Ohlsen Date: Wed Sep 7 12:01:12 2022 -0700 Bump to version 2.3 (#658) Signed-off-by: Tyler Ohlsen commit d9d956dc138d66981a94765ad8229a9eab6b8c35 Author: Chris Moore <107723039+cwillum@users.noreply.github.com> Date: Wed Sep 7 11:41:03 2022 -0700 “fix#921-README-forum-link-AD” (#659) Signed-off-by: cwillum commit 884573fc4d7343ddea6c7b8d638234a2805b87c2 Author: Amit Galitzky Date: Wed Sep 7 10:56:07 2022 +0300 Removed additional non-inclusive terms (#644) Signed-off-by: Amit Galitzky commit bd063bd0778400161ed1e75e3314ec7bb67e31af Author: Rishikesh Pasham <62345295+Rishikesh1159@users.noreply.github.com> Date: Fri Sep 2 20:12:25 2022 +0000 Adding external property customDistributionUrl to let developers override the default distribution download url (#380) * Adding usage and external property customDistributionUrl to let developers override the default distribution download url Signed-off-by: Rishikesh1159 * Adding doc and removing system property from build.gradle Signed-off-by: Rishikesh1159 commit c92cdc82fb2aef58b3c0b95fafa90da81bfc24cf Author: Kaituo Li Date: Tue Aug 9 09:36:08 2022 -0700 Fix PowerMock related tests due to version bump (#637) 2.2 added a new dependency of ClusterSettings, such as TaskResourceTrackingService. It is possible that the classloader that loads TaskResourceTrackingService is different from the classloader that loads PowerMock and related dependencies. PowerMock reports java.lang.NoClassDefFoundError when initializing ClusterSettings. Since we are not actually using the value of ClusterSettings in the tests, we make it null to avoid initializing it. The change fixed the failing tests. This PR also fixed spotless errors in FakeNode due to recent changes.
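A minimal sketch of the null-ClusterSettings workaround described above; SomeService is a hypothetical stand-in (typed as Object so the sketch compiles without OpenSearch on the classpath), not the plugin's actual test code:

    import org.junit.Assert;
    import org.junit.Before;
    import org.junit.Test;

    // Hypothetical stand-in for a production class whose constructor takes ClusterSettings.
    class SomeService {
        private final Object clusterSettings;

        SomeService(Object clusterSettings) {
            this.clusterSettings = clusterSettings; // never read by the tests
        }
    }

    public class SomeServiceTests {
        private SomeService service;

        @Before
        public void setUp() {
            // Pass null instead of building ClusterSettings: the tests never use its
            // value, and skipping its initialization avoids the classloader mismatch
            // that made PowerMock throw java.lang.NoClassDefFoundError.
            service = new SomeService(null);
        }

        @Test
        public void serviceBuildsWithoutClusterSettings() {
            Assert.assertNotNull(service);
        }
    }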
Testing done: * gradle build Signed-off-by: Kaituo Li commit 103f03492681ce999fb87d13ed0218461b084c96 Author: Tyler Ohlsen Date: Mon Aug 8 17:14:30 2022 -0400 Fix taskmanager compilation error in FakeNode (#634) Signed-off-by: Tyler Ohlsen commit 01bbfc525b93068145f09e3e468dec23d1bbea26 Author: Amit Galitzky Date: Fri Aug 5 16:30:29 2022 -0700 2.2 release notes (#631) Signed-off-by: Amit Galitzky commit 12f9f2e337ee81c4882bc02fb68f631ea9ab1430 Author: Tyler Ohlsen Date: Thu Aug 4 20:25:45 2022 -0400 Bump version to 2.2 (#627) Signed-off-by: Tyler Ohlsen commit 7f846016c92800383ed848936ed82251108c8947 Author: Tyler Ohlsen Date: Thu Aug 4 19:46:25 2022 -0400 Update BWC zip links (#625) Signed-off-by: Tyler Ohlsen commit 08fdbdde4c2f97c27f63d3a4ba31f8acd6dbf570 Author: Kaituo Li Date: Wed Aug 3 16:40:17 2022 -0700 make 1M1min possible (#620) * make 1M1min possible This PR improves performance to make the 1M1min experiment (one million entities with a one-minute detector interval) possible. First, I changed coordinating node pagination from sync to async mode in AnomalyResultTransportAction so that the coordinating node does not have to wait for model nodes' responses before fetching the next page. Second, during the million-entity evaluation, CPU is mostly around 1% with hourly spikes up to 65%. An internal hourly maintenance job accounts for the spike, due to saving hundreds of thousands of model checkpoints, clearing unused models, and performing bookkeeping for internal states. This PR evens out the resource usage across a large maintenance window by introducing CheckpointMaintainWorker. Third, when a model is corrupted, I retrigger cold start as mitigation. Check ModelManager.score, EntityResultTransportAction, and CheckpointReadWorker. Testing done: 1. Added unit tests. 2. Manually confirmed 1M1min is possible after the above changes. Signed-off-by: Kaituo Li commit eb7bd07774658c7e0216e5436cfc613b15547d3d Author: Prudhvi Godithi Date: Mon Aug 1 15:35:09 2022 -0400 Staging for version increment automation (#608) Signed-off-by: pgodithi commit 9f6a5abb9652fb010decde8778c56c292eac90e2 Author: Amit Galitzky Date: Wed Jul 13 10:07:23 2022 -0700 fix zip fetching issue on version increment (#611) Signed-off-by: Amit Galitzky commit f630c8f97070e35a50fca8a7725767286c548a47 Author: Kaituo Li Date: Wed Jun 29 10:41:57 2022 -0700 Expose model accuracy metrics in tests (#600) * Expose model accuracy metrics in tests This PR adds an option flag to print logs during tests and turns the flag on in the CI workflow. The flag is disabled by default. By doing this, we can record model accuracy metrics in git workflows and later retrieve them for analysis. Testing done: 1. We can turn logs on/off during tests. 2. The accuracy logs are recorded.
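A minimal sketch of the opt-in logging idea, assuming the flag is read as the test.logs system property (the -Dtest.logs=true setting visible in the CI workflow diff further below); the class and method names are illustrative:

    public class AccuracyLog {
        // Disabled by default; Boolean.getBoolean returns true only when the JVM
        // is started with -Dtest.logs=true (an assumption about how the flag is read).
        private static final boolean ENABLED = Boolean.getBoolean("test.logs");

        static void record(String metric, double value) {
            if (ENABLED) {
                System.out.println("accuracy metric " + metric + "=" + value);
            }
        }

        public static void main(String[] args) {
            record("precision", 0.91); // prints only when the flag is set
        }
    }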
Signed-off-by: Kaituo Li commit d484f9baa9aedf0a48215a82043107cd23c50c0b Author: Tyler Ohlsen Date: Tue Jun 28 16:00:56 2022 -0400 Add 2.1.0 release notes (#597) Signed-off-by: Tyler Ohlsen commit 6e373a7452f2d38795c5b73d77282a5239382915 Author: Amit Galitzky Date: Mon Jun 27 16:01:32 2022 -0700 adding custom plugin to upload ad zip to maven (#594) Signed-off-by: Amit Galitzky commit 0bd6e0e6aa933e6cd6ada8dfe88762b6e0db441a Author: Amit Galitzky Date: Mon Jun 27 13:59:36 2022 -0700 Update ingestion (#592) Signed-off-by: Amit Galitzky commit 4d7a8a452524e47b10fdba24f60c83f3103571ae Author: Amit Galitzky Date: Fri Jun 24 16:46:04 2022 -0700 Adding HCAD data ingestion script to AD (#585) Signed-off-by: Amit Galitzky commit c6f9b20334a4d280defe7f4bd680a69437358ba1 Author: Amit Galitzky Date: Fri Jun 24 13:06:29 2022 -0700 Cluster manager revert fix (#584) Signed-off-by: Amit Galitzky commit 0e7079e31ffab5439ff386fbd1a52e734a5d6500 Author: Amit Galitzky Date: Thu Jun 23 15:38:22 2022 -0700 2.1 version bump and Gradle bump (#582) Signed-off-by: Amit Galitzky commit 03e04d70556a2785782bbfd1603f3c19fbd8c895 Author: Kaituo Li Date: Wed Jun 22 11:55:00 2022 -0700 Disable interpolation in HCAD cold start (#575) * Disable interpolation in HCAD cold start Previously, we used interpolation in HCAD cold start for efficiency. This hurt model accuracy. This PR removes interpolation from the cold start step. Testing done: 1. added unit tests to verify that precision improved. Signed-off-by: Kaituo Li commit 7f3820a9bca636b76f6f85601ad8439e1b55a0d2 Author: Tyler Ohlsen Date: Tue Jun 14 11:11:57 2022 -0700 Add 2.0.1 release notes (#572) Signed-off-by: Tyler Ohlsen commit 5f4f9cd9957e72f7844ad8dd66dd6ee2ad8615e0 Author: Amit Galitzky Date: Thu Jun 9 18:13:11 2022 +0000 bump rcf to 3.0-rc3 Signed-off-by: Amit Galitzky commit 17bf3f83e37735f9f9562ae51530123cfb7bb252 Author: Kaituo Li Date: Mon May 23 14:24:19 2022 -0700 Use current time as training data end time (#547) (#557) * Use current time as training data end time The bug happens because we use the job enabled time as the training data end time. But if the historical data before that time is deleted or does not exist at all, cold start might never finish. This PR uses the current time as the training data end time so that cold start has a chance to succeed later. This PR also removes the code that combines cold start data and existing samples in EntityColdStartWorker, because we don't add samples until cold start succeeds, which makes combining cold start data and existing samples unnecessary. Testing done: 1. manually verified the bug is fixed. 2. fixed all related unit tests.
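A minimal sketch of the end-time change described above, assuming an injectable Clock; the names are illustrative, not the actual cold start code:

    import java.time.Clock;
    import java.time.Instant;

    public class TrainingWindow {
        private final Clock clock;

        public TrainingWindow(Clock clock) {
            this.clock = clock; // injectable so tests can control time
        }

        // Before the fix: the end time was pinned to the job enabled time, so if
        // data before that point was deleted, every cold start attempt failed.
        Instant buggyEndTime(Instant jobEnabledTime) {
            return jobEnabledTime;
        }

        // After the fix: the end time follows the current time, so a later retry
        // can see newly ingested data and cold start has a chance to succeed.
        Instant fixedEndTime() {
            return Instant.now(clock);
        }

        public static void main(String[] args) {
            TrainingWindow w = new TrainingWindow(Clock.systemUTC());
            System.out.println("training data end time: " + w.fixedEndTime());
        }
    }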
Signed-off-by: Kaituo Li commit 1ada5d73d0e96a58cc8c461cf8b4d757b762f348 Author: aksingh-es Date: Mon May 16 16:13:31 2022 -0700 GA release notes (#550) Signed-off-by: aksingh-es commit dbd311a9ed6022b6d42db8ea432cc478d29a0947 Author: Amit Galitzky Date: Thu May 12 10:03:33 2022 -0700 remove rc1 qualifier and _type from test (#543) Signed-off-by: Amit Galitzky commit 42b7f50890ad0e60e93ef6aa5c5389621edcf7ef Author: Xun Zhang Date: Tue May 3 13:45:21 2022 -0700 Increase more coverage and reduce jacocoExclusions (#533) Signed-off-by: Xun Zhang commit 8227e32b692dc137112f6cc42b475a465aeec347 Author: Amit Galitzky Date: Tue May 3 10:31:16 2022 -0700 bump rcf to 3.0-rc2.1 (#519) Signed-off-by: Amit Galitzky commit b17d483610826fa283639b2218050c05cef3d257 Author: Xun Zhang Date: Wed Apr 27 14:57:21 2022 -0700 refactor SearchADResultTransportAction to be more testable (#517) Signed-off-by: Xun Zhang commit be319d1a7712b9d6634b6bba06dbff399cd44c98 Author: AMIT KUMAR SINGH Date: Thu Apr 21 10:57:45 2022 -0700 Fix OS version in notes (#512) Signed-off-by: aksingh-es commit a44ffec2b3cd6de3f92ec2c1f7994b1cbbf31485 Author: AMIT KUMAR SINGH <98787372+aksingh-es@users.noreply.github.com> Date: Wed Apr 20 17:28:15 2022 -0700 Re-enable Tests for CI workflows (#509) * Re enable tests for CI Signed-off-by: aksingh-es * reenable CI tests Signed-off-by: aksingh-es * fixing minor issue Signed-off-by: aksingh-es * fixing minor issue for CI Signed-off-by: aksingh-es commit d874960d7afe77bb48796ced2c5b214ca5303d8c Author: Tyler Ohlsen Date: Wed Apr 20 12:39:57 2022 -0700 Update labeler to default backport to 2.x (#507) Signed-off-by: Tyler Ohlsen Signed-off-by: Kaituo Li --- .github/CODEOWNERS | 4 +- .github/labeler.yml | 2 +- .github/workflows/auto-release.yml | 2 +- .github/workflows/backport.yml | 2 +- .github/workflows/labeler.yml | 2 +- .../workflows/test_build_multi_platform.yml | 2 +- .github/workflows/test_security.yml | 15 +- MAINTAINERS.md | 38 +- build.gradle | 358 ++-- gradle.properties | 30 + gradle/wrapper/gradle-wrapper.jar | Bin 63721 -> 43462 bytes gradle/wrapper/gradle-wrapper.properties | 1 + ...aly-detection.release-notes-2.0.0.0-rc1.md | 35 + .../ad/AnomalyDetectorJobRunner.java | 75 +- .../ad/AnomalyDetectorProfileRunner.java | 116 +- .../opensearch/ad/AnomalyDetectorRunner.java | 37 +- .../opensearch/ad/EntityProfileRunner.java | 60 +- .../ad/ExecuteADResultResponseRecorder.java | 77 +- .../java/org/opensearch/ad/ProfileUtil.java | 15 +- .../opensearch/ad/caching/CacheBuffer.java | 16 +- .../org/opensearch/ad/caching/DoorKeeper.java | 4 +- .../opensearch/ad/caching/EntityCache.java | 6 +- .../opensearch/ad/caching/PriorityCache.java | 54 +- .../ad/caching/PriorityTracker.java | 2 +- .../opensearch/ad/cluster/ADDataMigrator.java | 72 +- .../opensearch/ad/cluster/ADVersionUtil.java | 9 +- .../cluster/ClusterManagerEventListener.java | 4 +- .../org/opensearch/ad/cluster/DailyCron.java | 12 +- .../org/opensearch/ad/cluster/HashRing.java | 24 +- .../org/opensearch/ad/cluster/HourlyCron.java | 2 +- .../ad/cluster/diskcleanup/IndexCleanup.java | 2 +- .../ModelCheckpointIndexRetention.java | 16 +- .../ad/common/exception/ClientException.java | 34 - .../ad/common/exception/InternalFailure.java | 35 - .../ad/constant/ADCommonMessages.java | 55 + .../{CommonName.java => ADCommonName.java} | 49 +- .../ad/constant/CommonErrorMessages.java | 137 -- ...ingleFeatureLinearUniformInterpolator.java | 40 - .../ad/dataprocessor/Interpolator.java | 37 - .../LinearUniformInterpolator.java | 57 - 
...ingleFeatureLinearUniformInterpolator.java | 75 - .../ad/feature/AbstractRetriever.java | 2 +- .../ad/feature/CompositeRetriever.java | 14 +- .../opensearch/ad/feature/FeatureManager.java | 63 +- .../org/opensearch/ad/indices/ADIndex.java | 44 +- .../ad/indices/ADIndexManagement.java | 275 +++ .../org/opensearch/ad/ml/CheckpointDao.java | 74 +- .../opensearch/ad/ml/EntityColdStarter.java | 105 +- .../org/opensearch/ad/ml/EntityModel.java | 2 +- .../org/opensearch/ad/ml/ModelManager.java | 23 +- .../java/org/opensearch/ad/ml/ModelState.java | 11 +- .../ml/TRCFMemoryAwareConcurrentHashmap.java | 8 +- .../opensearch/ad/ml/ThresholdingResult.java | 127 +- .../ad/model/ADEntityTaskProfile.java | 1 + .../java/org/opensearch/ad/model/ADTask.java | 506 +----- .../opensearch/ad/model/ADTaskProfile.java | 5 +- .../org/opensearch/ad/model/ADTaskType.java | 9 +- .../opensearch/ad/model/AnomalyDetector.java | 504 ++---- .../model/AnomalyDetectorExecutionInput.java | 5 +- .../opensearch/ad/model/AnomalyResult.java | 314 +--- .../ad/model/AnomalyResultBucket.java | 2 +- .../ad/model/DetectorInternalState.java | 4 +- .../opensearch/ad/model/DetectorProfile.java | 44 +- .../ad/model/DetectorProfileName.java | 48 +- .../ad/model/DetectorValidationIssue.java | 13 +- .../ad/model/DetectorValidationIssueType.java | 47 - .../opensearch/ad/model/EntityProfile.java | 14 +- .../ad/model/EntityProfileName.java | 24 +- .../ad/model/ExpectedValueList.java | 8 +- .../org/opensearch/ad/model/ModelProfile.java | 40 +- .../ad/model/ModelProfileOnNode.java | 4 +- .../opensearch/ad/ratelimit/BatchWorker.java | 10 +- .../CheckPointMaintainRequestAdapter.java | 2 +- .../ratelimit/CheckpointMaintainWorker.java | 18 +- .../ad/ratelimit/CheckpointReadWorker.java | 81 +- .../ad/ratelimit/CheckpointWriteWorker.java | 38 +- .../ad/ratelimit/ColdEntityWorker.java | 18 +- .../ad/ratelimit/ConcurrentWorker.java | 10 +- .../ad/ratelimit/EntityColdStartWorker.java | 19 +- .../ad/ratelimit/EntityFeatureRequest.java | 2 +- .../ad/ratelimit/EntityRequest.java | 2 +- .../ad/ratelimit/QueuedRequest.java | 2 +- .../ratelimit/RateLimitedRequestWorker.java | 30 +- .../ad/ratelimit/ResultWriteRequest.java | 2 +- .../ad/ratelimit/ResultWriteWorker.java | 32 +- .../ad/ratelimit/ScheduledWorker.java | 10 +- .../ad/ratelimit/SingleRequestWorker.java | 6 +- .../rest/AbstractAnomalyDetectorAction.java | 20 +- .../ad/rest/AbstractSearchAction.java | 10 +- .../ad/rest/RestAnomalyDetectorJobAction.java | 43 +- .../rest/RestDeleteAnomalyDetectorAction.java | 16 +- .../rest/RestDeleteAnomalyResultsAction.java | 14 +- .../RestExecuteAnomalyDetectorAction.java | 28 +- .../ad/rest/RestGetAnomalyDetectorAction.java | 55 +- .../rest/RestIndexAnomalyDetectorAction.java | 28 +- .../RestPreviewAnomalyDetectorAction.java | 24 +- .../ad/rest/RestSearchADTasksAction.java | 10 +- .../rest/RestSearchAnomalyDetectorAction.java | 11 +- .../RestSearchAnomalyDetectorInfoAction.java | 22 +- .../rest/RestSearchAnomalyResultAction.java | 20 +- .../RestSearchTopAnomalyResultAction.java | 16 +- .../rest/RestStatsAnomalyDetectorAction.java | 14 +- .../RestValidateAnomalyDetectorAction.java | 28 +- .../AbstractAnomalyDetectorActionHandler.java | 210 ++- .../handler/AnomalyDetectorActionHandler.java | 17 +- .../IndexAnomalyDetectorActionHandler.java | 8 +- .../IndexAnomalyDetectorJobActionHandler.java | 114 +- .../handler/ModelValidationActionHandler.java | 178 +- .../ValidateAnomalyDetectorActionHandler.java | 8 +- ...bledSetting.java => ADEnabledSetting.java} | 42 +- 
...ericSetting.java => ADNumericSetting.java} | 16 +- .../ad/settings/AnomalyDetectorSettings.java | 341 ++-- ...gacyOpenDistroAnomalyDetectorSettings.java | 2 +- .../stats/suppliers/ModelsOnNodeSupplier.java | 13 +- .../opensearch/ad/task/ADBatchTaskCache.java | 22 +- .../opensearch/ad/task/ADBatchTaskRunner.java | 218 +-- .../ad/task/ADHCBatchTaskRunState.java | 4 +- .../ad/task/ADTaskCacheManager.java | 281 +-- .../org/opensearch/ad/task/ADTaskManager.java | 420 ++--- .../transport/ADBatchAnomalyResultAction.java | 2 +- .../ADBatchTaskRemoteExecutionAction.java | 2 +- .../ad/transport/ADCancelTaskAction.java | 2 +- .../ad/transport/ADCancelTaskNodeRequest.java | 4 +- .../ad/transport/ADCancelTaskRequest.java | 6 +- .../ADCancelTaskTransportAction.java | 6 +- .../ADResultBulkTransportAction.java | 24 +- .../ad/transport/ADTaskProfileAction.java | 2 +- .../transport/ADTaskProfileNodeRequest.java | 4 +- .../transport/ADTaskProfileNodeResponse.java | 4 +- .../ad/transport/ADTaskProfileRequest.java | 6 +- .../ADTaskProfileTransportAction.java | 2 +- .../transport/AnomalyDetectorJobAction.java | 5 +- .../transport/AnomalyDetectorJobRequest.java | 10 +- .../transport/AnomalyDetectorJobResponse.java | 71 - .../AnomalyDetectorJobTransportAction.java | 47 +- .../ad/transport/AnomalyResultRequest.java | 12 +- .../ad/transport/AnomalyResultResponse.java | 7 +- .../AnomalyResultTransportAction.java | 157 +- .../ad/transport/CronTransportAction.java | 2 +- .../DeleteAnomalyDetectorRequest.java | 4 +- .../DeleteAnomalyDetectorTransportAction.java | 43 +- .../DeleteAnomalyResultsTransportAction.java | 14 +- .../ad/transport/DeleteModelRequest.java | 8 +- .../transport/DeleteModelTransportAction.java | 2 +- .../ad/transport/EntityProfileRequest.java | 37 +- .../ad/transport/EntityProfileResponse.java | 24 +- .../EntityProfileTransportAction.java | 14 +- .../ad/transport/EntityResultRequest.java | 54 +- .../EntityResultTransportAction.java | 92 +- .../ad/transport/ForwardADTaskAction.java | 7 +- .../ad/transport/ForwardADTaskRequest.java | 40 +- .../ForwardADTaskTransportAction.java | 47 +- .../transport/GetAnomalyDetectorRequest.java | 2 +- .../transport/GetAnomalyDetectorResponse.java | 12 +- .../GetAnomalyDetectorTransportAction.java | 55 +- .../IndexAnomalyDetectorResponse.java | 2 +- .../IndexAnomalyDetectorTransportAction.java | 44 +- .../PreviewAnomalyDetectorRequest.java | 2 +- ...PreviewAnomalyDetectorTransportAction.java | 54 +- .../ad/transport/ProfileNodeRequest.java | 4 +- .../ad/transport/ProfileNodeResponse.java | 32 +- .../ad/transport/ProfileRequest.java | 2 +- .../ad/transport/ProfileResponse.java | 44 +- .../ad/transport/ProfileTransportAction.java | 8 +- .../ad/transport/RCFPollingRequest.java | 8 +- .../transport/RCFPollingTransportAction.java | 14 +- .../ad/transport/RCFResultRequest.java | 15 +- .../ad/transport/RCFResultResponse.java | 7 +- .../transport/RCFResultTransportAction.java | 14 +- .../SearchAnomalyDetectorInfoResponse.java | 2 +- ...rchAnomalyDetectorInfoTransportAction.java | 12 +- .../SearchAnomalyResultTransportAction.java | 7 +- .../SearchTopAnomalyResultRequest.java | 4 +- ...SearchTopAnomalyResultTransportAction.java | 39 +- .../StatsAnomalyDetectorTransportAction.java | 13 +- .../ad/transport/StopDetectorRequest.java | 8 +- .../StopDetectorTransportAction.java | 6 +- .../ad/transport/ThresholdResultRequest.java | 15 +- .../ad/transport/ThresholdResultResponse.java | 6 +- ...alidateAnomalyDetectorTransportAction.java | 48 +- 
.../ad/transport/handler/ADSearchHandler.java | 16 +- .../handler/AnomalyIndexHandler.java | 32 +- .../AnomalyResultBulkIndexHandler.java | 34 +- .../handler/MultiEntityResultHandler.java | 22 +- .../java/org/opensearch/ad/util/BulkUtil.java | 1 + src/main/java/org/opensearch/ad/util/Bwc.java | 32 - .../org/opensearch/ad/util/ClientUtil.java | 318 ---- .../org/opensearch/ad/util/IndexUtils.java | 23 +- .../org/opensearch/ad/util/Throttler.java | 73 - .../constant/ForecastCommonMessages.java | 62 + .../forecast/constant/ForecastCommonName.java | 48 + .../constant/ForecastCommonValue.java | 17 + .../forecast/indices/ForecastIndex.java | 72 + .../indices/ForecastIndexManagement.java | 271 +++ .../forecast/model/ForecastResult.java | 590 ++++++ .../forecast/model/ForecastTask.java | 389 ++++ .../forecast/model/ForecastTaskType.java | 69 + .../opensearch/forecast/model/Forecaster.java | 405 +++++ .../settings/ForecastEnabledSetting.java | 92 + .../settings/ForecastNumericSetting.java | 59 + .../forecast/settings/ForecastSettings.java | 389 ++++ .../opensearch/timeseries/AnalysisType.java | 11 + .../{ad => timeseries}/CleanState.java | 2 +- .../timeseries/ExceptionRecorder.java | 20 + .../{ad => timeseries}/ExpiringState.java | 2 +- .../{ad => timeseries}/MaintenanceState.java | 2 +- .../{ad => timeseries}/MemoryTracker.java | 153 +- .../opensearch/{ad => timeseries}/Name.java | 2 +- .../{ad => timeseries}/NodeState.java | 180 +- .../{ad => timeseries}/NodeStateManager.java | 398 +++-- .../TimeSeriesAnalyticsPlugin.java} | 400 +++-- .../annotation/Generated.java | 2 +- .../breaker/BreakerName.java | 2 +- .../breaker/CircuitBreaker.java | 2 +- .../breaker/CircuitBreakerService.java} | 16 +- .../breaker/MemoryCircuitBreaker.java | 2 +- .../breaker/ThresholdCircuitBreaker.java | 2 +- .../common/exception/ClientException.java | 34 + .../exception/DuplicateTaskException.java | 4 +- .../common/exception/EndRunException.java | 10 +- .../common/exception/InternalFailure.java | 35 + .../exception/LimitExceededException.java | 20 +- .../NotSerializedExceptionName.java} | 60 +- .../exception/ResourceNotFoundException.java | 12 +- .../exception/TaskCancelledException.java} | 6 +- .../exception/TimeSeriesException.java} | 41 +- .../exception/ValidationException.java} | 28 +- .../common/exception/VersionException.java} | 10 +- .../timeseries/constant/CommonMessages.java | 142 ++ .../timeseries/constant/CommonName.java | 116 ++ .../timeseries/constant/CommonValue.java | 12 + .../dataprocessor/FixedValueImputer.java | 47 + .../dataprocessor/ImputationMethod.java | 25 + .../dataprocessor/ImputationOption.java | 147 ++ .../timeseries/dataprocessor/Imputer.java | 48 + .../dataprocessor/LinearUniformImputer.java | 82 + .../dataprocessor/PreviousValueImputer.java | 42 + .../timeseries/dataprocessor/ZeroImputer.java | 21 + .../feature/SearchFeatureDao.java | 145 +- .../function/BiCheckedFunction.java | 11 + .../function/ExecutorFunction.java} | 4 +- .../function}/ThrowingConsumer.java | 2 +- .../function}/ThrowingSupplier.java | 2 +- .../function}/ThrowingSupplierWrapper.java | 2 +- .../indices/IndexManagement.java} | 1590 ++++++++--------- .../timeseries/indices/TimeSeriesIndex.java | 22 + .../timeseries/ml/IntermediateResult.java | 86 + .../ml/SingleStreamModelIdMapper.java | 2 +- .../opensearch/timeseries/model/Config.java | 575 ++++++ .../model/DataByFeatureId.java | 11 +- .../model/DateRange.java} | 18 +- .../{ad => timeseries}/model/Entity.java | 105 +- .../{ad => timeseries}/model/Feature.java | 8 +- .../{ad 
=> timeseries}/model/FeatureData.java | 4 +- .../timeseries/model/IndexableResult.java | 258 +++ .../model/IntervalTimeConfiguration.java | 16 +- .../model/Job.java} | 29 +- .../model/MergeableList.java | 4 +- .../model/TaskState.java} | 20 +- .../opensearch/timeseries/model/TaskType.java | 17 + .../model/TimeConfiguration.java | 2 +- .../timeseries/model/TimeSeriesTask.java | 448 +++++ .../model/ValidationAspect.java | 17 +- .../timeseries/model/ValidationIssueType.java | 52 + .../settings/DynamicNumericSetting.java} | 16 +- .../settings/TimeSeriesSettings.java | 212 +++ .../{ad => timeseries}/stats/StatNames.java | 2 +- .../task/RealtimeTaskCache.java} | 10 +- .../timeseries/task/TaskCacheManager.java | 251 +++ .../transport/BackPressureRouting.java | 2 +- .../timeseries/transport/JobResponse.java | 48 + .../timeseries/util/ClientUtil.java | 68 + .../opensearch/timeseries/util/DataUtil.java | 48 + .../util/DiscoveryNodeFilterer.java | 8 +- .../util/ExceptionUtil.java | 12 +- .../MultiResponsesDelegateActionListener.java | 2 +- .../{ad => timeseries}/util/ParseUtils.java | 206 ++- .../util/RestHandlerUtils.java | 46 +- .../timeseries/util/SafeSecurityInjector.java | 87 + .../util/SecurityClientUtil.java | 40 +- .../{ad => timeseries}/util/SecurityUtil.java | 14 +- .../util/TimeSeriesSafeSecurityInjector.java} | 45 +- ...rch.jobscheduler.spi.JobSchedulerExtension | 2 +- src/main/resources/es-plugin.properties | 2 +- ...heckpoint.json => anomaly-checkpoint.json} | 0 .../{anomaly-detectors.json => config.json} | 0 .../mappings/forecast-checkpoint.json | 64 + .../resources/mappings/forecast-results.json | 131 ++ .../resources/mappings/forecast-state.json | 133 ++ .../{anomaly-detector-jobs.json => job.json} | 0 src/test/java/org/opensearch/BwcTests.java | 564 ------ .../opensearch/EntityProfileRequest1_0.java | 105 -- .../opensearch/EntityProfileResponse1_0.java | 172 -- .../opensearch/EntityResultRequest1_0.java | 105 -- .../java/org/opensearch/ModelProfile1_0.java | 114 -- .../opensearch/ProfileNodeResponse1_0.java | 134 -- .../org/opensearch/ProfileResponse1_0.java | 169 -- .../org/opensearch/RCFResultResponse1_0.java | 87 - .../opensearch/StreamInputOutputTests.java | 293 +++ ...ndexAnomalyDetectorActionHandlerTests.java | 41 +- ...dateAnomalyDetectorActionHandlerTests.java | 28 +- .../org/opensearch/ad/ADIntegTestCase.java | 49 +- .../ad/AbstractProfileRunnerTests.java | 8 +- .../ad/AnomalyDetectorJobRunnerTests.java | 148 +- .../ad/AnomalyDetectorProfileRunnerTests.java | 80 +- .../ad/AnomalyDetectorRestTestCase.java | 52 +- .../ad/EntityProfileRunnerTests.java | 83 +- .../ad/HistoricalAnalysisIntegTestCase.java | 61 +- .../ad/HistoricalAnalysisRestTestCase.java | 9 +- .../org/opensearch/ad/MemoryTrackerTests.java | 49 +- .../ad/MultiEntityProfileRunnerTests.java | 33 +- .../org/opensearch/ad/ODFERestTestCase.java | 6 +- .../breaker/ADCircuitBreakerServiceTests.java | 6 +- .../ad/breaker/MemoryCircuitBreakerTests.java | 2 + .../ad/bwc/ADBackwardsCompatibilityIT.java | 33 +- .../ad/caching/AbstractCacheTest.java | 18 +- .../ad/caching/CacheBufferTests.java | 4 +- .../ad/caching/PriorityCacheTests.java | 68 +- .../AnomalyDetectionNodeClientTests.java | 26 +- .../cluster/ADClusterEventListenerTests.java | 8 +- .../ad/cluster/ADDataMigratorTests.java | 72 +- .../ad/cluster/ADVersionUtilTests.java | 13 +- .../ClusterManagerEventListenerTests.java | 16 +- .../opensearch/ad/cluster/DailyCronTests.java | 6 +- .../opensearch/ad/cluster/HashRingTests.java | 29 +- 
.../ad/cluster/HourlyCronTests.java | 10 +- .../diskcleanup/IndexCleanupTests.java | 6 +- .../ModelCheckpointIndexRetentionTests.java | 22 +- .../ADTaskCancelledExceptionTests.java | 3 +- .../exception/ADValidationExceptionTests.java | 43 - .../LimitExceededExceptionTests.java | 3 +- .../NotSerializedADExceptionNameTests.java | 41 +- ...FeatureLinearUniformInterpolatorTests.java | 71 - .../ad/e2e/AbstractSyntheticDataTest.java | 9 +- .../ad/e2e/DetectionResultEvalutationIT.java | 13 +- .../ad/e2e/SingleStreamModelPerfIT.java | 4 +- .../ad/feature/FeatureManagerTests.java | 125 +- .../indices/AnomalyDetectionIndicesTests.java | 105 +- .../ad/indices/CustomIndexTests.java | 41 +- .../InitAnomalyDetectionIndicesTests.java | 81 +- .../opensearch/ad/indices/RolloverTests.java | 39 +- .../ad/indices/UpdateMappingTests.java | 30 +- .../ad/ml/AbstractCosineDataTest.java | 87 +- .../opensearch/ad/ml/CheckpointDaoTests.java | 77 +- .../ad/ml/CheckpointDeleteTests.java | 18 +- .../ad/ml/EntityColdStarterTests.java | 190 +- .../opensearch/ad/ml/HCADModelPerfTests.java | 69 +- .../opensearch/ad/ml/ModelManagerTests.java | 66 +- .../ad/ml/SingleStreamModelIdMapperTests.java | 1 + .../ad/ml/ThresholdingResultTests.java | 7 + .../ad/mock/plugin/MockReindexPlugin.java | 6 +- .../MockADCancelTaskNodeRequest_1_0.java | 2 +- .../MockAnomalyDetectorJobAction.java | 6 +- ...alyDetectorJobTransportActionWithUser.java | 39 +- .../MockForwardADTaskRequest_1_0.java | 10 +- .../ad/model/ADEntityTaskProfileTests.java | 7 +- .../org/opensearch/ad/model/ADTaskTests.java | 17 +- .../AnomalyDetectorExecutionInputTests.java | 2 +- .../ad/model/AnomalyDetectorJobTests.java | 15 +- .../AnomalyDetectorSerializationTests.java | 6 +- .../ad/model/AnomalyDetectorTests.java | 155 +- .../ad/model/AnomalyResultBucketTests.java | 6 +- .../ad/model/AnomalyResultTests.java | 10 +- .../ad/model/DetectionDateRangeTests.java | 27 +- .../ad/model/DetectorInternalStateTests.java | 2 +- .../ad/model/DetectorProfileTests.java | 29 +- .../ad/model/EntityAnomalyResultTests.java | 2 +- .../ad/model/EntityProfileTests.java | 14 +- .../org/opensearch/ad/model/EntityTests.java | 5 +- .../opensearch/ad/model/FeatureDataTests.java | 3 +- .../org/opensearch/ad/model/FeatureTests.java | 3 +- .../model/IntervalTimeConfigurationTests.java | 4 +- .../ad/model/MergeableListTests.java | 5 +- .../ad/model/ModelProfileTests.java | 7 +- .../ad/plugin/MockReindexPlugin.java | 6 +- .../ratelimit/AbstractRateLimitingTest.java | 16 +- ...CheckPointMaintainRequestAdapterTests.java | 8 +- .../CheckpointMaintainWorkerTests.java | 35 +- .../ratelimit/CheckpointReadWorkerTests.java | 158 +- .../ratelimit/CheckpointWriteWorkerTests.java | 87 +- .../ad/ratelimit/ColdEntityWorkerTests.java | 47 +- .../ratelimit/EntityColdStartWorkerTests.java | 23 +- .../ad/ratelimit/ResultWriteWorkerTests.java | 35 +- .../opensearch/ad/rest/ADRestTestUtils.java | 35 +- .../ad/rest/AnomalyDetectorRestApiIT.java | 391 ++-- .../ad/rest/HistoricalAnalysisRestApiIT.java | 57 +- .../opensearch/ad/rest/SecureADRestIT.java | 154 +- ...xAnomalyDetectorJobActionHandlerTests.java | 48 +- .../ad/settings/ADEnabledSettingTests.java | 75 + .../ad/settings/ADNumericSettingTests.java | 50 + .../AnomalyDetectorSettingsTests.java | 165 +- .../org/opensearch/ad/stats/ADStatsTests.java | 10 +- .../suppliers/ModelsOnNodeSupplierTests.java | 6 +- .../ad/task/ADTaskCacheManagerTests.java | 80 +- .../ad/task/ADTaskManagerTests.java | 203 +-- .../ADBatchAnomalyResultRequestTests.java | 2 +- 
...atchAnomalyResultTransportActionTests.java | 63 +- .../ADCancelTaskNodeRequestTests.java | 2 +- .../ad/transport/ADCancelTaskTests.java | 8 +- .../ADResultBulkTransportActionTests.java | 10 +- .../ad/transport/ADStatsITTests.java | 6 +- .../ADStatsNodesTransportActionTests.java | 17 +- .../opensearch/ad/transport/ADStatsTests.java | 6 +- .../transport/ADTaskProfileResponseTests.java | 2 +- .../ad/transport/ADTaskProfileTests.java | 22 +- .../AnomalyDetectorJobActionTests.java | 22 +- ...nomalyDetectorJobTransportActionTests.java | 94 +- .../ad/transport/AnomalyResultTests.java | 117 +- .../AnomalyResultTransportActionTests.java | 20 +- .../transport/CronTransportActionTests.java | 17 +- .../DeleteAnomalyDetectorActionTests.java | 2 +- .../transport/DeleteAnomalyDetectorTests.java | 30 +- ...teAnomalyDetectorTransportActionTests.java | 4 +- ...eteAnomalyResultsTransportActionTests.java | 6 +- .../ad/transport/DeleteITTests.java | 6 +- .../DeleteModelTransportActionTests.java | 10 +- .../opensearch/ad/transport/DeleteTests.java | 16 +- .../ad/transport/EntityProfileTests.java | 26 +- .../EntityResultTransportActionTests.java | 68 +- .../transport/ForwardADTaskRequestTests.java | 29 +- .../ad/transport/ForwardADTaskTests.java | 10 +- .../ForwardADTaskTransportActionTests.java | 7 +- .../GetAnomalyDetectorActionTests.java | 24 +- .../GetAnomalyDetectorResponseTests.java | 6 +- .../ad/transport/GetAnomalyDetectorTests.java | 34 +- ...etAnomalyDetectorTransportActionTests.java | 47 +- .../IndexAnomalyDetectorActionTests.java | 4 +- ...exAnomalyDetectorTransportActionTests.java | 28 +- .../ad/transport/MultiEntityResultTests.java | 107 +- .../PreviewAnomalyDetectorActionTests.java | 4 +- ...ewAnomalyDetectorTransportActionTests.java | 64 +- .../ad/transport/ProfileITTests.java | 6 +- .../opensearch/ad/transport/ProfileTests.java | 17 +- .../ProfileTransportActionTests.java | 10 +- .../ad/transport/RCFPollingTests.java | 16 +- .../ad/transport/RCFResultITTests.java | 6 +- .../ad/transport/RCFResultTests.java | 24 +- .../transport/SearchADTasksActionTests.java | 8 +- .../SearchADTasksTransportActionTests.java | 4 +- .../SearchAnomalyDetectorActionTests.java | 5 +- .../SearchAnomalyDetectorInfoActionTests.java | 6 +- .../SearchAnomalyResultActionTests.java | 16 +- .../SearchTopAnomalyResultActionTests.java | 2 +- .../SearchTopAnomalyResultRequestTests.java | 6 +- .../SearchTopAnomalyResultResponseTests.java | 2 +- ...hTopAnomalyResultTransportActionTests.java | 6 +- ...tsAnomalyDetectorTransportActionTests.java | 4 +- .../ad/transport/ThresholdResultITTests.java | 6 +- .../ad/transport/ThresholdResultTests.java | 16 +- .../ValidateAnomalyDetectorRequestTests.java | 2 +- .../ValidateAnomalyDetectorResponseTests.java | 10 +- ...teAnomalyDetectorTransportActionTests.java | 102 +- .../handler/ADSearchHandlerTests.java | 12 +- .../handler/AbstractIndexHandlerTest.java | 34 +- .../AnomalyResultBulkIndexHandlerTests.java | 33 +- .../handler/AnomalyResultHandlerTests.java | 20 +- .../MultiEntityResultHandlerTests.java | 10 +- .../ad/util/ExceptionUtilsTests.java | 9 +- .../opensearch/ad/util/IndexUtilsTests.java | 32 +- .../opensearch/ad/util/ParseUtilsTests.java | 27 +- .../ad/util/RestHandlerUtilsTests.java | 21 +- .../opensearch/ad/util/ThrottlerTests.java | 67 - .../ad/util/ThrowingSupplierWrapperTests.java | 1 + .../indices/ForecastIndexManagementTests.java | 338 ++++ .../indices/ForecastIndexMappingTests.java | 87 + .../indices/ForecastResultIndexTests.java | 229 +++ 
.../forecast/model/ForecastResultTests.java | 103 ++ .../model/ForecastSerializationTests.java | 85 + .../model/ForecastTaskSerializationTests.java | 121 ++ .../forecast/model/ForecastTaskTests.java | 38 + .../forecast/model/ForecastTaskTypeTests.java | 53 + .../forecast/model/ForecasterTests.java | 396 ++++ .../settings/ForecastEnabledSettingTests.java | 30 + .../settings/ForecastNumericSettingTests.java | 50 + .../metrics/CardinalityProfileTests.java | 48 +- .../AbstractTimeSeriesTest.java} | 18 +- .../timeseries/DataByFeatureIdTests.java | 81 + .../NodeStateManagerTests.java | 215 ++- .../{ad => timeseries}/NodeStateTests.java | 20 +- .../{ad => timeseries}/TestHelpers.java | 481 ++++- .../TimeSeriesPluginTests.java} | 13 +- .../exception/ValidationExceptionTests.java | 51 + .../dataprocessor/FixedValueImputerTests.java | 34 + .../dataprocessor/ImputationOptionTests.java | 124 ++ ...gerSensitiveLinearUniformImputerTests.java | 72 + ...ultiFeatureLinearUniformImputerTests.java} | 15 +- .../PreviousValueImputerTests.java | 27 + ...ngleFeatureLinearUniformImputerTests.java} | 21 +- .../dataprocessor/ZeroImputerTests.java | 39 + .../NoPowermockSearchFeatureDaoTests.java | 64 +- .../feature/SearchFeatureDaoParamTests.java | 45 +- .../feature/SearchFeatureDaoTests.java | 43 +- .../indices/IndexManagementIntegTestCase.java | 104 ++ .../timeseries/util/ClientUtilTests.java | 153 ++ .../timeseries/util/LTrimTests.java | 43 + ...iResponsesDelegateActionListenerTests.java | 4 +- .../test/org/opensearch/ad/util/FakeNode.java | 3 +- .../test/org/opensearch/ad/util/MLUtil.java | 20 +- .../ad/util/RandomModelStateConfig.java | 2 +- src/test/resources/security/sample.pem | 2 +- 500 files changed, 16802 insertions(+), 11652 deletions(-) create mode 100644 gradle.properties create mode 100644 release-notes/opensearch-anomaly-detection.release-notes-2.0.0.0-rc1.md delete mode 100644 src/main/java/org/opensearch/ad/common/exception/ClientException.java delete mode 100644 src/main/java/org/opensearch/ad/common/exception/InternalFailure.java create mode 100644 src/main/java/org/opensearch/ad/constant/ADCommonMessages.java rename src/main/java/org/opensearch/ad/constant/{CommonName.java => ADCommonName.java} (69%) delete mode 100644 src/main/java/org/opensearch/ad/constant/CommonErrorMessages.java delete mode 100644 src/main/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolator.java delete mode 100644 src/main/java/org/opensearch/ad/dataprocessor/Interpolator.java delete mode 100644 src/main/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolator.java delete mode 100644 src/main/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolator.java create mode 100644 src/main/java/org/opensearch/ad/indices/ADIndexManagement.java delete mode 100644 src/main/java/org/opensearch/ad/model/DetectorValidationIssueType.java rename src/main/java/org/opensearch/ad/settings/{EnabledSetting.java => ADEnabledSetting.java} (68%) rename src/main/java/org/opensearch/ad/settings/{NumericSetting.java => ADNumericSetting.java} (75%) delete mode 100644 src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobResponse.java delete mode 100644 src/main/java/org/opensearch/ad/util/Bwc.java delete mode 100644 src/main/java/org/opensearch/ad/util/ClientUtil.java delete mode 100644 src/main/java/org/opensearch/ad/util/Throttler.java create mode 100644 src/main/java/org/opensearch/forecast/constant/ForecastCommonMessages.java create mode 100644 
src/main/java/org/opensearch/forecast/constant/ForecastCommonName.java create mode 100644 src/main/java/org/opensearch/forecast/constant/ForecastCommonValue.java create mode 100644 src/main/java/org/opensearch/forecast/indices/ForecastIndex.java create mode 100644 src/main/java/org/opensearch/forecast/indices/ForecastIndexManagement.java create mode 100644 src/main/java/org/opensearch/forecast/model/ForecastResult.java create mode 100644 src/main/java/org/opensearch/forecast/model/ForecastTask.java create mode 100644 src/main/java/org/opensearch/forecast/model/ForecastTaskType.java create mode 100644 src/main/java/org/opensearch/forecast/model/Forecaster.java create mode 100644 src/main/java/org/opensearch/forecast/settings/ForecastEnabledSetting.java create mode 100644 src/main/java/org/opensearch/forecast/settings/ForecastNumericSetting.java create mode 100644 src/main/java/org/opensearch/forecast/settings/ForecastSettings.java create mode 100644 src/main/java/org/opensearch/timeseries/AnalysisType.java rename src/main/java/org/opensearch/{ad => timeseries}/CleanState.java (94%) create mode 100644 src/main/java/org/opensearch/timeseries/ExceptionRecorder.java rename src/main/java/org/opensearch/{ad => timeseries}/ExpiringState.java (94%) rename src/main/java/org/opensearch/{ad => timeseries}/MaintenanceState.java (96%) rename src/main/java/org/opensearch/{ad => timeseries}/MemoryTracker.java (83%) rename src/main/java/org/opensearch/{ad => timeseries}/Name.java (95%) rename src/main/java/org/opensearch/{ad => timeseries}/NodeState.java (56%) rename src/main/java/org/opensearch/{ad => timeseries}/NodeStateManager.java (56%) rename src/main/java/org/opensearch/{ad/AnomalyDetectorPlugin.java => timeseries/TimeSeriesAnalyticsPlugin.java} (75%) rename src/main/java/org/opensearch/{ad => timeseries}/annotation/Generated.java (94%) rename src/main/java/org/opensearch/{ad => timeseries}/breaker/BreakerName.java (92%) rename src/main/java/org/opensearch/{ad => timeseries}/breaker/CircuitBreaker.java (91%) rename src/main/java/org/opensearch/{ad/breaker/ADCircuitBreakerService.java => timeseries/breaker/CircuitBreakerService.java} (80%) rename src/main/java/org/opensearch/{ad => timeseries}/breaker/MemoryCircuitBreaker.java (95%) rename src/main/java/org/opensearch/{ad => timeseries}/breaker/ThresholdCircuitBreaker.java (94%) create mode 100644 src/main/java/org/opensearch/timeseries/common/exception/ClientException.java rename src/main/java/org/opensearch/{ad => timeseries}/common/exception/DuplicateTaskException.java (77%) rename src/main/java/org/opensearch/{ad => timeseries}/common/exception/EndRunException.java (75%) create mode 100644 src/main/java/org/opensearch/timeseries/common/exception/InternalFailure.java rename src/main/java/org/opensearch/{ad => timeseries}/common/exception/LimitExceededException.java (61%) rename src/main/java/org/opensearch/{ad/common/exception/NotSerializedADExceptionName.java => timeseries/common/exception/NotSerializedExceptionName.java} (66%) rename src/main/java/org/opensearch/{ad => timeseries}/common/exception/ResourceNotFoundException.java (62%) rename src/main/java/org/opensearch/{ad/common/exception/ADTaskCancelledException.java => timeseries/common/exception/TaskCancelledException.java} (73%) rename src/main/java/org/opensearch/{ad/common/exception/AnomalyDetectionException.java => timeseries/common/exception/TimeSeriesException.java} (55%) rename src/main/java/org/opensearch/{ad/common/exception/ADValidationException.java => 
timeseries/common/exception/ValidationException.java} (69%) rename src/main/java/org/opensearch/{ad/common/exception/ADVersionException.java => timeseries/common/exception/VersionException.java} (57%) create mode 100644 src/main/java/org/opensearch/timeseries/constant/CommonMessages.java create mode 100644 src/main/java/org/opensearch/timeseries/constant/CommonName.java create mode 100644 src/main/java/org/opensearch/timeseries/constant/CommonValue.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/FixedValueImputer.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/ImputationMethod.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/ImputationOption.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/Imputer.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/LinearUniformImputer.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputer.java create mode 100644 src/main/java/org/opensearch/timeseries/dataprocessor/ZeroImputer.java rename src/main/java/org/opensearch/{ad => timeseries}/feature/SearchFeatureDao.java (90%) create mode 100644 src/main/java/org/opensearch/timeseries/function/BiCheckedFunction.java rename src/main/java/org/opensearch/{ad/rest/handler/AnomalyDetectorFunction.java => timeseries/function/ExecutorFunction.java} (85%) rename src/main/java/org/opensearch/{ad/util => timeseries/function}/ThrowingConsumer.java (92%) rename src/main/java/org/opensearch/{ad/util => timeseries/function}/ThrowingSupplier.java (92%) rename src/main/java/org/opensearch/{ad/util => timeseries/function}/ThrowingSupplierWrapper.java (96%) rename src/main/java/org/opensearch/{ad/indices/AnomalyDetectionIndices.java => timeseries/indices/IndexManagement.java} (58%) create mode 100644 src/main/java/org/opensearch/timeseries/indices/TimeSeriesIndex.java create mode 100644 src/main/java/org/opensearch/timeseries/ml/IntermediateResult.java rename src/main/java/org/opensearch/{ad => timeseries}/ml/SingleStreamModelIdMapper.java (98%) create mode 100644 src/main/java/org/opensearch/timeseries/model/Config.java rename src/main/java/org/opensearch/{ad => timeseries}/model/DataByFeatureId.java (90%) rename src/main/java/org/opensearch/{ad/model/DetectionDateRange.java => timeseries/model/DateRange.java} (87%) rename src/main/java/org/opensearch/{ad => timeseries}/model/Entity.java (79%) rename src/main/java/org/opensearch/{ad => timeseries}/model/Feature.java (96%) rename src/main/java/org/opensearch/{ad => timeseries}/model/FeatureData.java (97%) create mode 100644 src/main/java/org/opensearch/timeseries/model/IndexableResult.java rename src/main/java/org/opensearch/{ad => timeseries}/model/IntervalTimeConfiguration.java (86%) rename src/main/java/org/opensearch/{ad/model/AnomalyDetectorJob.java => timeseries/model/Job.java} (90%) rename src/main/java/org/opensearch/{ad => timeseries}/model/MergeableList.java (91%) rename src/main/java/org/opensearch/{ad/model/ADTaskState.java => timeseries/model/TaskState.java} (68%) create mode 100644 src/main/java/org/opensearch/timeseries/model/TaskType.java rename src/main/java/org/opensearch/{ad => timeseries}/model/TimeConfiguration.java (98%) create mode 100644 src/main/java/org/opensearch/timeseries/model/TimeSeriesTask.java rename src/main/java/org/opensearch/{ad => timeseries}/model/ValidationAspect.java (75%) create mode 100644 
src/main/java/org/opensearch/timeseries/model/ValidationIssueType.java rename src/main/java/org/opensearch/{ad/settings/AbstractSetting.java => timeseries/settings/DynamicNumericSetting.java} (84%) create mode 100644 src/main/java/org/opensearch/timeseries/settings/TimeSeriesSettings.java rename src/main/java/org/opensearch/{ad => timeseries}/stats/StatNames.java (98%) rename src/main/java/org/opensearch/{ad/task/ADRealtimeTaskCache.java => timeseries/task/RealtimeTaskCache.java} (89%) create mode 100644 src/main/java/org/opensearch/timeseries/task/TaskCacheManager.java rename src/main/java/org/opensearch/{ad => timeseries}/transport/BackPressureRouting.java (98%) create mode 100644 src/main/java/org/opensearch/timeseries/transport/JobResponse.java create mode 100644 src/main/java/org/opensearch/timeseries/util/ClientUtil.java create mode 100644 src/main/java/org/opensearch/timeseries/util/DataUtil.java rename src/main/java/org/opensearch/{ad => timeseries}/util/DiscoveryNodeFilterer.java (92%) rename src/main/java/org/opensearch/{ad => timeseries}/util/ExceptionUtil.java (95%) rename src/main/java/org/opensearch/{ad => timeseries}/util/MultiResponsesDelegateActionListener.java (98%) rename src/main/java/org/opensearch/{ad => timeseries}/util/ParseUtils.java (80%) rename src/main/java/org/opensearch/{ad => timeseries}/util/RestHandlerUtils.java (87%) create mode 100644 src/main/java/org/opensearch/timeseries/util/SafeSecurityInjector.java rename src/main/java/org/opensearch/{ad => timeseries}/util/SecurityClientUtil.java (82%) rename src/main/java/org/opensearch/{ad => timeseries}/util/SecurityUtil.java (86%) rename src/main/java/org/opensearch/{ad/util/ADSafeSecurityInjector.java => timeseries/util/TimeSeriesSafeSecurityInjector.java} (52%) rename src/main/resources/mappings/{checkpoint.json => anomaly-checkpoint.json} (100%) rename src/main/resources/mappings/{anomaly-detectors.json => config.json} (100%) create mode 100644 src/main/resources/mappings/forecast-checkpoint.json create mode 100644 src/main/resources/mappings/forecast-results.json create mode 100644 src/main/resources/mappings/forecast-state.json rename src/main/resources/mappings/{anomaly-detector-jobs.json => job.json} (100%) delete mode 100644 src/test/java/org/opensearch/BwcTests.java delete mode 100644 src/test/java/org/opensearch/EntityProfileRequest1_0.java delete mode 100644 src/test/java/org/opensearch/EntityProfileResponse1_0.java delete mode 100644 src/test/java/org/opensearch/EntityResultRequest1_0.java delete mode 100644 src/test/java/org/opensearch/ModelProfile1_0.java delete mode 100644 src/test/java/org/opensearch/ProfileNodeResponse1_0.java delete mode 100644 src/test/java/org/opensearch/ProfileResponse1_0.java delete mode 100644 src/test/java/org/opensearch/RCFResultResponse1_0.java create mode 100644 src/test/java/org/opensearch/StreamInputOutputTests.java delete mode 100644 src/test/java/org/opensearch/ad/common/exception/ADValidationExceptionTests.java delete mode 100644 src/test/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolatorTests.java create mode 100644 src/test/java/org/opensearch/ad/settings/ADEnabledSettingTests.java create mode 100644 src/test/java/org/opensearch/ad/settings/ADNumericSettingTests.java delete mode 100644 src/test/java/org/opensearch/ad/util/ThrottlerTests.java create mode 100644 src/test/java/org/opensearch/forecast/indices/ForecastIndexManagementTests.java create mode 100644 src/test/java/org/opensearch/forecast/indices/ForecastIndexMappingTests.java 
 create mode 100644 src/test/java/org/opensearch/forecast/indices/ForecastResultIndexTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecastResultTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecastSerializationTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecastTaskSerializationTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecastTaskTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecastTaskTypeTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/model/ForecasterTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/settings/ForecastEnabledSettingTests.java
 create mode 100644 src/test/java/org/opensearch/forecast/settings/ForecastNumericSettingTests.java
 rename src/test/java/org/opensearch/{ad/AbstractADTest.java => timeseries/AbstractTimeSeriesTest.java} (96%)
 create mode 100644 src/test/java/org/opensearch/timeseries/DataByFeatureIdTests.java
 rename src/test/java/org/opensearch/{ad => timeseries}/NodeStateManagerTests.java (66%)
 rename src/test/java/org/opensearch/{ad => timeseries}/NodeStateTests.java (82%)
 rename src/test/java/org/opensearch/{ad => timeseries}/TestHelpers.java (78%)
 rename src/test/java/org/opensearch/{ad/AnomalyDetectorPluginTests.java => timeseries/TimeSeriesPluginTests.java} (87%)
 create mode 100644 src/test/java/org/opensearch/timeseries/common/exception/ValidationExceptionTests.java
 create mode 100644 src/test/java/org/opensearch/timeseries/dataprocessor/FixedValueImputerTests.java
 create mode 100644 src/test/java/org/opensearch/timeseries/dataprocessor/ImputationOptionTests.java
 create mode 100644 src/test/java/org/opensearch/timeseries/dataprocessor/IntegerSensitiveLinearUniformImputerTests.java
 rename src/test/java/org/opensearch/{ad/dataprocessor/LinearUniformInterpolatorTests.java => timeseries/dataprocessor/MultiFeatureLinearUniformImputerTests.java} (81%)
 create mode 100644 src/test/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputerTests.java
 rename src/test/java/org/opensearch/{ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolatorTests.java => timeseries/dataprocessor/SingleFeatureLinearUniformImputerTests.java} (54%)
 create mode 100644 src/test/java/org/opensearch/timeseries/dataprocessor/ZeroImputerTests.java
 rename src/test/java/org/opensearch/{ad => timeseries}/feature/NoPowermockSearchFeatureDaoTests.java (93%)
 rename src/test/java/org/opensearch/{ad => timeseries}/feature/SearchFeatureDaoParamTests.java (90%)
 rename src/test/java/org/opensearch/{ad => timeseries}/feature/SearchFeatureDaoTests.java (91%)
 create mode 100644 src/test/java/org/opensearch/timeseries/indices/IndexManagementIntegTestCase.java
 create mode 100644 src/test/java/org/opensearch/timeseries/util/ClientUtilTests.java
 create mode 100644 src/test/java/org/opensearch/timeseries/util/LTrimTests.java
 rename src/test/java/org/opensearch/{ad => timeseries}/util/MultiResponsesDelegateActionListenerTests.java (96%)

diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index cd025f06f..aff582980 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -1,2 +1,2 @@
-# This should match the owning team set up in https://github.com/orgs/opensearch-project/teams
-* @opensearch-project/anomaly-detection
\ No newline at end of file
+* @jmazanec15 @jngz-es @kaituo @saratvemulapalli @ohltyler @vamshin @VijayanB @ylwu-amzn @amitgalitz @sudiptoguha @jackiehanyang @sean-zheng-amazon @dbwiddis @owaiskazi19 @joshpalis
+
diff --git a/.github/labeler.yml b/.github/labeler.yml
index 53dc0443c..9f6f50c54 100644
--- a/.github/labeler.yml
+++ b/.github/labeler.yml
@@ -1,4 +1,4 @@
-backport 1.x:
+backport 2.x:
   - "*"
   - "*/*"
   - "*/**/*"
diff --git a/.github/workflows/auto-release.yml b/.github/workflows/auto-release.yml
index 8fb1a6190..2916ab207 100644
--- a/.github/workflows/auto-release.yml
+++ b/.github/workflows/auto-release.yml
@@ -13,7 +13,7 @@ jobs:
     steps:
       - name: GitHub App token
         id: github_app_token
-        uses: tibdex/github-app-token@v2.0.0
+        uses: tibdex/github-app-token@v2.1.0
         with:
           app_id: ${{ secrets.APP_ID }}
           private_key: ${{ secrets.APP_PRIVATE_KEY }}
diff --git a/.github/workflows/backport.yml b/.github/workflows/backport.yml
index a34e8d699..5ed1dcdce 100644
--- a/.github/workflows/backport.yml
+++ b/.github/workflows/backport.yml
@@ -15,7 +15,7 @@ jobs:
     steps:
      - name: GitHub App token
        id: github_app_token
-       uses: tibdex/github-app-token@v2.0.0
+       uses: tibdex/github-app-token@v2.1.0
        with:
          app_id: ${{ secrets.APP_ID }}
          private_key: ${{ secrets.APP_PRIVATE_KEY }}
diff --git a/.github/workflows/labeler.yml b/.github/workflows/labeler.yml
index 3e11da394..910df1d3b 100644
--- a/.github/workflows/labeler.yml
+++ b/.github/workflows/labeler.yml
@@ -15,7 +15,7 @@ jobs:
     steps:
       - name: GitHub App token
         id: github_app_token
-        uses: tibdex/github-app-token@v2.0.0
+        uses: tibdex/github-app-token@v2.1.0
         with:
           app_id: ${{ secrets.APP_ID }}
           private_key: ${{ secrets.APP_PRIVATE_KEY }}
diff --git a/.github/workflows/test_build_multi_platform.yml b/.github/workflows/test_build_multi_platform.yml
index ebb7185e9..7456e70a6 100644
--- a/.github/workflows/test_build_multi_platform.yml
+++ b/.github/workflows/test_build_multi_platform.yml
@@ -127,7 +127,7 @@ jobs:
           ./gradlew assemble
       - name: Build and Run Tests
         run: |
-          ./gradlew build -Dtest.logs=true
+          ./gradlew build
       - name: Publish to Maven Local
         run: |
           ./gradlew publishToMavenLocal
diff --git a/.github/workflows/test_security.yml b/.github/workflows/test_security.yml
index 5aaaa6a08..7254a9b43 100644
--- a/.github/workflows/test_security.yml
+++ b/.github/workflows/test_security.yml
@@ -16,8 +16,6 @@ jobs:
     name: Security test workflow for Anomaly Detection
     runs-on: ubuntu-latest
 
-    env:
-      JENKINS_URL: build.ci.opensearch.org
 
     steps:
       - name: Setup Java ${{ matrix.java }}
@@ -26,7 +24,7 @@
           distribution: 'temurin'
           java-version: ${{ matrix.java }}
 
-      # anomaly-detection
+      # time-series-analytics
       - name: Checkout AD
         uses: actions/checkout@v4
@@ -34,14 +32,14 @@
         run: |
           ./gradlew assemble
       # example of variables:
-      # plugin = opensearch-anomaly-detection-2.4.0.0-SNAPSHOT.zip
-      # version = 2.4.0, plugin_version = 2.4.0.0, qualifier = SNAPSHOT
+      # plugin = opensearch-time-series-analytics-2.10.0.0-SNAPSHOT.zip
+      # version = 2.10.0, plugin_version = 2.10.0.0, qualifier = SNAPSHOT
       - name: Pull and Run Docker
         run: |
           plugin=`basename $(ls build/distributions/*.zip)`
-          version=`echo $plugin|awk -F- '{print $4}'| cut -d. -f 1-3`
-          plugin_version=`echo $plugin|awk -F- '{print $4}'| cut -d. -f 1-4`
-          qualifier=`echo $plugin|awk -F- '{print $5}'| cut -d. -f 1-1`
+          version=`echo $plugin|awk -F- '{print $5}'| cut -d. -f 1-3`
+          plugin_version=`echo $plugin|awk -F- '{print $5}'| cut -d. -f 1-4`
+          qualifier=`echo $plugin|awk -F- '{print $6}'| cut -d. -f 1-1`
 
           if $qualifier!=SNAPSHOT
           then
@@ -57,6 +55,7 @@
           then
             echo "FROM opensearchstaging/opensearch:$docker_version" >> Dockerfile
             echo "RUN if [ -d /usr/share/opensearch/plugins/opensearch-anomaly-detection ]; then /usr/share/opensearch/bin/opensearch-plugin remove opensearch-anomaly-detection; fi" >> Dockerfile
+            echo "RUN if [ -d /usr/share/opensearch/plugins/opensearch-time-series-analytics ]; then /usr/share/opensearch/bin/opensearch-plugin remove opensearch-time-series-analytics; fi" >> Dockerfile
             echo "ADD anomaly-detection/build/distributions/$plugin /tmp/" >> Dockerfile
             echo "RUN /usr/share/opensearch/bin/opensearch-plugin install --batch file:/tmp/$plugin" >> Dockerfile
             docker build -t opensearch-ad:test .
diff --git a/MAINTAINERS.md b/MAINTAINERS.md
index a1abcfd83..fabf00e7a 100644
--- a/MAINTAINERS.md
+++ b/MAINTAINERS.md
@@ -4,16 +4,28 @@ This document contains a list of maintainers in this repo. See [opensearch-proje
 ## Current Maintainers
 
-| Maintainer              | GitHub ID                                               | Affiliation |
-| ----------------------- | ------------------------------------------------------- | ----------- |
-| Hanguang Zhang          | [zhanghg08](https://github.com/zhanghg08)               | Amazon      |
-| Jack Mazanec            | [jmazanec15](https://github.com/jmazanec15)             | Amazon      |
-| Jing Zhang              | [jngz-es](https://github.com/jngz-es)                   | Amazon      |
-| Kaituo Li               | [kaituo](https://github.com/kaituo)                     | Amazon      |
-| Lai Jiang               | [wnbts](https://github.com/wnbts)                       | Amazon      |
-| Sarat Vemulapalli       | [saratvemulapalli](https://github.com/saratvemulapalli) | Amazon      |
-| Tyler Ohlsen            | [ohltyler](https://github.com/ohltyler)                 | Amazon      |
-| Vamshi Vijay Nakkirtha  | [vamshin](https://github.com/vamshin)                   | Amazon      |
-| Vijayan Balasubramanian | [VijayanB](https://github.com/VijayanB)                 | Amazon      |
-| Yaliang Wu              | [ylwu-amzn](https://github.com/ylwu-amzn)               | Amazon      |
-| Yizhe Liu               | [yizheliu-amazon](https://github.com/yizheliu-amazon)   | Amazon      |
+| Maintainer              | GitHub ID                                                 | Affiliation |
+| ----------------------- | ---------------------------------------------------------| ----------- |
+| Jack Mazanec            | [jmazanec15](https://github.com/jmazanec15)               | Amazon      |
+| Jing Zhang              | [jngz-es](https://github.com/jngz-es)                     | Amazon      |
+| Kaituo Li               | [kaituo](https://github.com/kaituo)                       | Amazon      |
+| Sarat Vemulapalli       | [saratvemulapalli](https://github.com/saratvemulapalli)   | Amazon      |
+| Tyler Ohlsen            | [ohltyler](https://github.com/ohltyler)                   | Amazon      |
+| Vamshi Vijay Nakkirtha  | [vamshin](https://github.com/vamshin)                     | Amazon      |
+| Vijayan Balasubramanian | [VijayanB](https://github.com/VijayanB)                   | Amazon      |
+| Yaliang Wu              | [ylwu-amzn](https://github.com/ylwu-amzn)                 | Amazon      |
+| Amit Galitzky           | [amitgalitz](https://github.com/amitgalitz)               | Amazon      |
+| Sudipto Guha            | [sudiptoguha](https://github.com/sudiptoguha)             | Amazon      |
+| Jackie Han              | [jackiehanyang](https://github.com/jackiehanyang)         | Amazon      |
+| Sean Zheng              | [sean-zheng-amazon](https://github.com/sean-zheng-amazon) | Amazon      |
+| Dan Widdis              | [dbwiddis](https://github.com/dbwiddis)                   | Amazon      |
+| Owais Kazi              | [owaiskazi19](https://github.com/owaiskazi19)             | Amazon      |
+| Josh Palis              | [joshpalis](https://github.com/joshpalis)                 | Amazon      |
+
+## Emeritus
+
+| Maintainer     | GitHub ID                                             | Affiliation |
+| -------------- | ----------------------------------------------------- | ----------- |
+| Hanguang Zhang | [zhanghg08](https://github.com/zhanghg08)             | Amazon      |
+| Yizhe Liu      | [yizheliu-amazon](https://github.com/yizheliu-amazon) | Amazon      |
+| Lai Jiang      | [wnbts](https://github.com/wnbts)                     | Amazon      |
\ No newline at end of file
diff --git a/build.gradle b/build.gradle
index 9e2130321..33ddf7e14 100644
--- a/build.gradle
+++ b/build.gradle
@@ -20,7 +20,7 @@ buildscript {
         isSnapshot = "true" == System.getProperty("build.snapshot", "true")
         opensearch_version = System.getProperty("opensearch.version", "2.13.0-SNAPSHOT")
         buildVersionQualifier = System.getProperty("build.version_qualifier", "")
-        // 2.3.0-SNAPSHOT -> 2.3.0.0-SNAPSHOT
+        // 3.0.0-SNAPSHOT -> 3.0.0.0-SNAPSHOT
         version_tokens = opensearch_version.tokenize('-')
         opensearch_build = version_tokens[0] + '.0'
         plugin_no_snapshot = opensearch_build
@@ -35,16 +35,20 @@ buildscript {
         js_resource_folder = "src/test/resources/job-scheduler"
         common_utils_version = System.getProperty("common_utils.version", opensearch_build)
         job_scheduler_version = System.getProperty("job_scheduler.version", opensearch_build)
-        job_scheduler_build_download = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/' + opensearch_no_snapshot +
-                '/latest/linux/x64/tar/builds/opensearch/plugins/opensearch-job-scheduler-' + plugin_no_snapshot + '.zip'
-        anomaly_detection_build_download = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/' + opensearch_no_snapshot +
-                '/latest/linux/x64/tar/builds/opensearch/plugins/opensearch-anomaly-detection-' + plugin_no_snapshot + '.zip'
-        bwcOpenSearchADDownload = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/1.3.2/latest/linux/x64/tar/builds/opensearch/plugins/opensearch-anomaly-detection-1.3.2.0.zip'
-        bwcOpenSearchJSDownload = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/1.3.2/latest/linux/x64/tar/builds/opensearch/plugins/opensearch-job-scheduler-1.3.2.0.zip'
+        bwcVersionShort = "2.10.0"
+        bwcVersion = bwcVersionShort + ".0"
+        bwcOpenSearchADDownload = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/' + bwcVersionShort + '/latest/linux/x64/tar/builds/' +
+                'opensearch/plugins/opensearch-anomaly-detection-' + bwcVersion + '.zip'
+        bwcOpenSearchJSDownload = 'https://ci.opensearch.org/ci/dbc/distribution-build-opensearch/' + bwcVersionShort + '/latest/linux/x64/tar/builds/' +
+                'opensearch/plugins/opensearch-job-scheduler-' + bwcVersion + '.zip'
+        baseName = "adBwcCluster"
+        bwcFilePath = "src/test/resources/org/opensearch/ad/bwc/"
+        bwcJobSchedulerPath = bwcFilePath + "job-scheduler/"
+        bwcAnomalyDetectionPath = bwcFilePath + "anomaly-detection/"
 
         // gradle build won't print logs during test by default unless there is a failure.
         // It is useful to record intermediately information like prediction precision and recall.
-        // This option turn on log printing during tests. 
+        // This option turn on log printing during tests.
         printLogs = "true" == System.getProperty("test.logs", "false")
     }
@@ -61,7 +65,7 @@ buildscript {
 }
 
 plugins {
-    id 'com.netflix.nebula.ospackage' version "11.6.0"
+    id 'com.netflix.nebula.ospackage' version "11.5.0"
     id "com.diffplug.spotless" version "6.24.0"
     id 'java-library'
     id 'org.gradle.test-retry' version '1.5.7'
@@ -98,6 +102,67 @@ repositories {
     maven { url "https://plugins.gradle.org/m2/" }
     maven { url "https://ci.opensearch.org/ci/dbc/snapshots/lucene/" }
 }
+configurations {
+    zipArchive
+    //hamcrest-core needs to be ignored since it causes jar hell exception due to a conflict during testing
+    testImplementation {
+        exclude group: 'org.hamcrest', module: 'hamcrest-core'
+    }
+}
+
+dependencies {
+    zipArchive group: 'org.opensearch.plugin', name:'opensearch-job-scheduler', version: "${opensearch_build}"
+    implementation "org.opensearch:opensearch:${opensearch_version}"
+    compileOnly "org.opensearch.plugin:opensearch-scripting-painless-spi:${opensearch_version}"
+    compileOnly "org.opensearch:opensearch-job-scheduler-spi:${job_scheduler_version}"
+    implementation "org.opensearch:common-utils:${common_utils_version}"
+    implementation "org.opensearch.client:opensearch-rest-client:${opensearch_version}"
+    compileOnly group: 'com.google.guava', name: 'guava', version:'32.1.3-jre'
+    compileOnly group: 'com.google.guava', name: 'failureaccess', version:'1.0.1'
+    implementation group: 'org.javassist', name: 'javassist', version:'3.28.0-GA'
+    implementation group: 'org.apache.commons', name: 'commons-math3', version: '3.6.1'
+    implementation group: 'com.google.code.gson', name: 'gson', version: '2.8.9'
+    implementation group: 'com.yahoo.datasketches', name: 'sketches-core', version: '0.13.4'
+    implementation group: 'com.yahoo.datasketches', name: 'memory', version: '0.12.2'
+    implementation group: 'commons-lang', name: 'commons-lang', version: '2.6'
+    implementation group: 'org.apache.commons', name: 'commons-pool2', version: '2.11.1'
+    implementation 'software.amazon.randomcutforest:randomcutforest-serialization:3.8.0'
+    implementation 'software.amazon.randomcutforest:randomcutforest-parkservices:3.8.0'
+    implementation 'software.amazon.randomcutforest:randomcutforest-core:3.8.0'
+
+    // we inherit jackson-core from opensearch core
+    implementation "com.fasterxml.jackson.core:jackson-databind:2.16.1"
+    implementation "com.fasterxml.jackson.core:jackson-annotations:2.16.1"
+
+    // used for serializing/deserializing rcf models.
+    implementation group: 'io.protostuff', name: 'protostuff-core', version: '1.8.0'
+    implementation group: 'io.protostuff', name: 'protostuff-runtime', version: '1.8.0'
+    implementation group: 'io.protostuff', name: 'protostuff-api', version: '1.8.0'
+    implementation group: 'io.protostuff', name: 'protostuff-collectionschema', version: '1.8.0'
+    implementation group: 'org.apache.commons', name: 'commons-lang3', version: '3.13.0'
+
+
+    implementation "org.jacoco:org.jacoco.agent:0.8.11"
+    implementation ("org.jacoco:org.jacoco.ant:0.8.11") {
+        exclude group: 'org.ow2.asm', module: 'asm-commons'
+        exclude group: 'org.ow2.asm', module: 'asm'
+        exclude group: 'org.ow2.asm', module: 'asm-tree'
+    }
+
+    testImplementation group: 'pl.pragmatists', name: 'JUnitParams', version: '1.1.1'
+    testImplementation group: 'org.mockito', name: 'mockito-core', version: '5.9.0'
+    testImplementation group: 'org.objenesis', name: 'objenesis', version: '3.3'
+    testImplementation group: 'net.bytebuddy', name: 'byte-buddy', version: '1.14.9'
+    testImplementation group: 'net.bytebuddy', name: 'byte-buddy-agent', version: '1.14.9'
+    testCompileOnly 'org.apiguardian:apiguardian-api:1.1.2'
+    // jupiter is required to run unit tests not inherited from OpenSearchTestCase (e.g., PreviousValueImputerTests)
+    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.10.0'
+    testImplementation 'org.junit.jupiter:junit-jupiter-params:5.10.0'
+    testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.10.0'
+    testImplementation "org.opensearch:opensearch-core:${opensearch_version}"
+    testRuntimeOnly("org.junit.platform:junit-platform-launcher:1.10.0")
+    testCompileOnly 'junit:junit:4.13.2'
+}
 
 apply plugin: 'java'
 apply plugin: 'idea'
@@ -135,9 +200,9 @@ ext {
 }
 
 opensearchplugin {
-    name 'opensearch-anomaly-detection'
-    description 'OpenSearch anomaly detector plugin'
-    classname 'org.opensearch.ad.AnomalyDetectorPlugin'
+    name 'opensearch-time-series-analytics'
+    description 'OpenSearch time series analytics plugin'
+    classname 'org.opensearch.timeseries.TimeSeriesAnalyticsPlugin'
     extendedPlugins = ['lang-painless', 'opensearch-job-scheduler']
 }
@@ -147,7 +212,8 @@ configurations.all {
     resolutionStrategy {
         force "joda-time:joda-time:${versions.joda}"
         force "commons-logging:commons-logging:${versions.commonslogging}"
-        force "org.apache.httpcomponents:httpcore:${versions.httpcore}"
+        force "org.apache.httpcomponents.core5:httpcore5:${versions.httpcore5}"
+        force "org.apache.httpcomponents.client5:httpclient5:${versions.httpclient5}"
         force "commons-codec:commons-codec:${versions.commonscodec}"
         force "org.mockito:mockito-core:5.9.0"
@@ -163,11 +229,6 @@ configurations.all {
     }
 }
 
-configurations {
-    testImplementation {
-        exclude group: 'org.hamcrest', module: 'hamcrest-core'
-    }
-}
 
 publishing {
     publications {
@@ -191,7 +252,7 @@ publishing {
             }
         }
     }
-    
+
     repositories {
         maven {
             name = "Snapshots"
@@ -232,10 +293,10 @@ opensearch_tmp_dir.mkdirs()
 test {
     retry {
         if (BuildParams.isCi()) {
-            failOnPassedAfterRetry = false
             maxRetries = 6
             maxFailures = 10
         }
+        failOnPassedAfterRetry = false
     }
     include '**/*Tests.class'
     systemProperty 'tests.security.manager', 'false'
@@ -257,10 +318,10 @@ tasks.named("check").configure { dependsOn(integTest) }
 integTest {
     retry {
         if (BuildParams.isCi()) {
-            failOnPassedAfterRetry = false
             maxRetries = 6
             maxFailures = 10
         }
+        failOnPassedAfterRetry = false
     }
     dependsOn "bundlePlugin"
     systemProperty 'tests.security.manager', 'false'
@@ -309,6 +370,7 @@ integTest {
     getClusters().forEach { cluster ->
         cluster.waitForAllConditions()
     }
+    println 'Running in CI mode:' + BuildParams.isCi()
 }
 
 // The --debug-jvm command-line option makes the cluster debuggable; this makes the tests debuggable
@@ -339,21 +401,13 @@ testClusters.integTest {
         }
     }
     plugin(project.tasks.bundlePlugin.archiveFile)
-
     plugin(provider(new Callable<RegularFile>(){
         @Override
         RegularFile call() throws Exception {
             return new RegularFile() {
                 @Override
                 File getAsFile() {
-                    if (new File("$project.rootDir/$js_resource_folder").exists()) {
-                        project.delete(files("$project.rootDir/$js_resource_folder"))
-                    }
-                    project.mkdir js_resource_folder
-                    ant.get(src: job_scheduler_build_download,
-                            dest: js_resource_folder,
-                            httpusecaches: false)
-                    return fileTree(js_resource_folder).getSingleFile()
+                    return configurations.zipArchive.asFileTree.getSingleFile()
                 }
             }
         }
@@ -409,87 +463,48 @@ task integTestRemote(type: RestIntegTestTask) {
     }
 }
 
-String bwcMinVersion = "1.3.2.0"
-String bwcBundleVersion = "1.3.2.0"
-Boolean bwcBundleTest = (project.findProperty('customDistributionDownloadType') != null &&
-    project.properties['customDistributionDownloadType'] == "bundle");
-String bwcVersion = bwcBundleTest ? bwcBundleVersion : bwcMinVersion
-String currentBundleVersion = opensearch_version.replace("-SNAPSHOT","")
-String baseName = "adBwcCluster"
-String bwcFilePath = "src/test/resources/org/opensearch/ad/bwc/"
-String bwcJobSchedulerPath = bwcFilePath + "job-scheduler/"
-String bwcAnomalyDetectionPath = bwcFilePath + "anomaly-detection/"
-
-2.times {i -> 
+2.times {i ->
     testClusters {
         "${baseName}$i" {
             testDistribution = "ARCHIVE"
+            versions = [bwcVersionShort, opensearch_version]
             numberOfNodes = 3
-            if (bwcBundleTest) {
-                versions = ["1.3.2", currentBundleVersion]
-                nodes.each { node ->
-                    node.extraConfigFile("kirk.pem", file("src/test/resources/security/kirk.pem"))
-                    node.extraConfigFile("kirk-key.pem", file("src/test/resources/security/kirk-key.pem"))
-                    node.extraConfigFile("esnode.pem", file("src/test/resources/security/esnode.pem"))
-                    node.extraConfigFile("esnode-key.pem", file("src/test/resources/security/esnode-key.pem"))
-                    node.extraConfigFile("root-ca.pem", file("src/test/resources/security/root-ca.pem"))
-                    node.setting("plugins.security.disabled", "true")
-                    node.setting("plugins.security.ssl.transport.pemcert_filepath", "esnode.pem")
-                    node.setting("plugins.security.ssl.transport.pemkey_filepath", "esnode-key.pem")
-                    node.setting("plugins.security.ssl.transport.pemtrustedcas_filepath", "root-ca.pem")
-                    node.setting("plugins.security.ssl.transport.enforce_hostname_verification", "false")
-                    node.setting("plugins.security.ssl.http.enabled", "true")
-                    node.setting("plugins.security.ssl.http.pemcert_filepath", "esnode.pem")
-                    node.setting("plugins.security.ssl.http.pemkey_filepath", "esnode-key.pem")
-                    node.setting("plugins.security.ssl.http.pemtrustedcas_filepath", "root-ca.pem")
-                    node.setting("plugins.security.allow_unsafe_democertificates", "true")
-                    node.setting("plugins.security.allow_default_init_securityindex", "true")
-                    node.setting("plugins.security.authcz.admin_dn", "CN=kirk,OU=client,O=client,L=test,C=de")
-                    node.setting("plugins.security.audit.type", "internal_elasticsearch")
-                    node.setting("plugins.security.enable_snapshot_restore_privilege", "true")
-                    node.setting("plugins.security.check_snapshot_restore_write_privileges", "true")
-                    node.setting("plugins.security.restapi.roles_enabled", "[\"all_access\", \"security_rest_api_access\"]")
-                    node.setting("plugins.security.system_indices.enabled", "true")
-                }
-            } else {
-                versions = ["1.3.2", opensearch_version]
-                plugin(provider(new Callable<RegularFile>(){
-                    @Override
-                    RegularFile call() throws Exception {
-                        return new RegularFile() {
-                            @Override
-                            File getAsFile() {
-                                if (new File("$project.rootDir/$bwcFilePath/job-scheduler/$bwcVersion").exists()) {
-                                    project.delete(files("$project.rootDir/$bwcFilePath/job-scheduler/$bwcVersion"))
-                                }
-                                project.mkdir bwcJobSchedulerPath + bwcVersion
-                                ant.get(src: bwcOpenSearchJSDownload,
-                                    dest: bwcJobSchedulerPath + bwcVersion,
-                                    httpusecaches: false)
-                                return fileTree(bwcJobSchedulerPath + bwcVersion).getSingleFile()
+            plugin(provider(new Callable<RegularFile>(){
+                @Override
+                RegularFile call() throws Exception {
+                    return new RegularFile() {
+                        @Override
+                        File getAsFile() {
+                            if (new File("$project.rootDir/$bwcFilePath/job-scheduler/$bwcVersion").exists()) {
+                                project.delete(files("$project.rootDir/$bwcFilePath/job-scheduler/$bwcVersion"))
                             }
+                            project.mkdir bwcJobSchedulerPath + bwcVersion
+                            ant.get(src: bwcOpenSearchJSDownload,
+                                dest: bwcJobSchedulerPath + bwcVersion,
+                                httpusecaches: false)
+                            return fileTree(bwcJobSchedulerPath + bwcVersion).getSingleFile()
                         }
                     }
-                }))
-                plugin(provider(new Callable<RegularFile>(){
-                    @Override
-                    RegularFile call() throws Exception {
-                        return new RegularFile() {
-                            @Override
-                            File getAsFile() {
-                                if (new File("$project.rootDir/$bwcFilePath/anomaly-detection/$bwcVersion").exists()) {
-                                    project.delete(files("$project.rootDir/$bwcFilePath/anomaly-detection/$bwcVersion"))
-                                }
-                                project.mkdir bwcAnomalyDetectionPath + bwcVersion
-                                ant.get(src: bwcOpenSearchADDownload,
-                                    dest: bwcAnomalyDetectionPath + bwcVersion,
-                                    httpusecaches: false)
-                                return fileTree(bwcAnomalyDetectionPath + bwcVersion).getSingleFile()
+                }
+            }))
+            plugin(provider(new Callable<RegularFile>(){
+                @Override
+                RegularFile call() throws Exception {
+                    return new RegularFile() {
+                        @Override
+                        File getAsFile() {
+                            if (new File("$project.rootDir/$bwcFilePath/anomaly-detection/$bwcVersion").exists()) {
+                                project.delete(files("$project.rootDir/$bwcFilePath/anomaly-detection/$bwcVersion"))
                             }
+                            project.mkdir bwcAnomalyDetectionPath + bwcVersion
+                            ant.get(src: bwcOpenSearchADDownload,
+                                dest: bwcAnomalyDetectionPath + bwcVersion,
+                                httpusecaches: false)
+                            return fileTree(bwcAnomalyDetectionPath + bwcVersion).getSingleFile()
                         }
                     }
-                }))
-            }
+                }
+            }))
             setting 'path.repo', "${buildDir}/cluster/shared/repo/${baseName}"
             setting 'http.content_type.required', 'true'
         }
@@ -503,14 +518,7 @@ List<Provider<RegularFile>> plugins = [
             return new RegularFile() {
                 @Override
                 File getAsFile() {
-                    if (new File("$project.rootDir/$bwcFilePath/job-scheduler/$opensearch_build").exists()) {
-                        project.delete(files("$project.rootDir/$bwcFilePath/job-scheduler/$opensearch_build"))
-                    }
-                    project.mkdir bwcJobSchedulerPath + opensearch_build
-                    ant.get(src: job_scheduler_build_download,
-                        dest: bwcJobSchedulerPath + opensearch_build,
-                        httpusecaches: false)
-                    return fileTree(bwcJobSchedulerPath + opensearch_build).getSingleFile()
+                    return configurations.zipArchive.asFileTree.getSingleFile()
                 }
             }
         }
@@ -521,15 +529,15 @@ List<Provider<RegularFile>> plugins = [
             return new RegularFile() {
                 @Override
                 File getAsFile() {
-                    return fileTree(bwcFilePath + "anomaly-detection/" + project.version).getSingleFile()
-                }
+                    return fileTree(bwcFilePath + "anomaly-detection/" + project.version).getSingleFile()
+                }
             }
         }
     })
 ]
 
-// Creates 2 test clusters with 3 nodes of the old version. 
-2.times {i -> 
+// Creates 2 test clusters with 3 nodes of the old version.
+2.times {i ->
     task "${baseName}#oldVersionClusterTask$i"(type: StandaloneRestIntegTestTask) {
         useCluster testClusters."${baseName}$i"
         filter {
@@ -540,25 +548,18 @@ List<Provider<RegularFile>> plugins = [
         systemProperty 'tests.plugin_bwc_version', bwcVersion
         nonInputProperties.systemProperty('tests.rest.cluster', "${-> testClusters."${baseName}$i".allHttpSocketURI.join(",")}")
         nonInputProperties.systemProperty('tests.clustername', "${-> testClusters."${baseName}$i".getName()}")
-    } 
+    }
 }
 
-// Upgrades one node of the old cluster to new OpenSearch version with upgraded plugin version 
+// Upgrades one node of the old cluster to new OpenSearch version with upgraded plugin version
 // This results in a mixed cluster with 2 nodes on the old version and 1 upgraded node.
 // This is also used as a one third upgraded cluster for a rolling upgrade.
 task "${baseName}#mixedClusterTask"(type: StandaloneRestIntegTestTask) {
     useCluster testClusters."${baseName}0"
     dependsOn "${baseName}#oldVersionClusterTask0"
-    if (bwcBundleTest){
-        doFirst {
-            testClusters."${baseName}0".nextNodeToNextVersion()
-        }
-    } else {
-        doFirst {
-            testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
-        }
+    doFirst {
+        testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
     }
-
     filter {
         includeTestsMatching "org.opensearch.ad.bwc.*IT"
     }
@@ -575,16 +576,8 @@ task "${baseName}#mixedClusterTask"(type: StandaloneRestIntegTestTask) {
 task "${baseName}#twoThirdsUpgradedClusterTask"(type: StandaloneRestIntegTestTask) {
     dependsOn "${baseName}#mixedClusterTask"
     useCluster testClusters."${baseName}0"
-    if (bwcBundleTest){
-        println 'Running in bwcBundleTest mode:' + BuildParams.isCi()
-        doFirst {
-            testClusters."${baseName}0".nextNodeToNextVersion()
-        }
-    } else {
-        println 'Running in CI mode:' + BuildParams.isCi()
-        doFirst {
-            testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
-        }
+    doFirst {
+        testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
     }
     filter {
         includeTestsMatching "org.opensearch.ad.bwc.*IT"
@@ -602,16 +595,9 @@
 task "${baseName}#rollingUpgradeClusterTask"(type: StandaloneRestIntegTestTask) {
     dependsOn "${baseName}#twoThirdsUpgradedClusterTask"
     useCluster testClusters."${baseName}0"
-    if (bwcBundleTest){
-        doFirst {
-            testClusters."${baseName}0".nextNodeToNextVersion()
-        }
-    } else {
-        doFirst {
-            testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
-        }
+    doFirst {
+        testClusters."${baseName}0".upgradeNodeAndPluginToNextVersion(plugins)
     }
-
     filter {
         includeTestsMatching "org.opensearch.ad.bwc.*IT"
     }
@@ -623,19 +609,13 @@ task "${baseName}#rollingUpgradeClusterTask"(type: StandaloneRestIntegTestTask)
     nonInputProperties.systemProperty('tests.clustername', "${-> testClusters."${baseName}0".getName()}")
 }
 
-// Upgrades all the nodes of the old cluster to new OpenSearch version with upgraded plugin version 
+// Upgrades all the nodes of the old cluster to new OpenSearch version with upgraded plugin version
 // at the same time resulting in a fully upgraded cluster.
 task "${baseName}#fullRestartClusterTask"(type: StandaloneRestIntegTestTask) {
     dependsOn "${baseName}#oldVersionClusterTask1"
     useCluster testClusters."${baseName}1"
-    if (bwcBundleTest){
-        doFirst {
-            testClusters."${baseName}1".goToNextVersion()
-        }
-    } else {
-        doFirst {
-            testClusters."${baseName}1".upgradeAllNodesAndPluginsToNextVersion(plugins)
-        }
+    doFirst {
+        testClusters."${baseName}1".upgradeAllNodesAndPluginsToNextVersion(plugins)
     }
     filter {
         includeTestsMatching "org.opensearch.ad.bwc.*IT"
@@ -680,7 +660,7 @@ task release(type: Copy, group: 'build') {
 
 List<String> jacocoExclusions = [
     // code for configuration, settings, etc is excluded from coverage
-    'org.opensearch.ad.AnomalyDetectorPlugin',
+    'org.opensearch.timeseries.TimeSeriesAnalyticsPlugin',
 
     // rest layer is tested in integration testing mostly, difficult to mock all of it
     'org.opensearch.ad.rest.*',
@@ -692,9 +672,10 @@ List<String> jacocoExclusions = [
 
     // Class containing just constants. Don't need to test
     'org.opensearch.ad.constant.*',
-
-    //'org.opensearch.ad.common.exception.AnomalyDetectionException',
-    'org.opensearch.ad.util.ClientUtil',
+    'org.opensearch.forecast.constant.*',
+    'org.opensearch.timeseries.constant.*',
+    'org.opensearch.timeseries.settings.TimeSeriesSettings',
+    'org.opensearch.forecast.settings.ForecastSettings',
 
     'org.opensearch.ad.transport.CronRequest',
     'org.opensearch.ad.AnomalyDetectorRunner',
@@ -735,64 +716,14 @@ jacocoTestCoverageVerification {
 
 jacocoTestReport {
     reports {
-        xml.required = true
-        html.required = true
+        xml.required = true // for coverlay
+        html.required = true // human readable
     }
 }
 
 check.dependsOn jacocoTestCoverageVerification
 jacocoTestCoverageVerification.dependsOn jacocoTestReport
 
-dependencies {
-    implementation "org.opensearch:opensearch:${opensearch_version}"
-    compileOnly "org.opensearch.plugin:opensearch-scripting-painless-spi:${opensearch_version}"
-    compileOnly "org.opensearch:opensearch-job-scheduler-spi:${job_scheduler_version}"
-    implementation "org.opensearch:common-utils:${common_utils_version}"
-    implementation "org.opensearch.client:opensearch-rest-client:${opensearch_version}"
-    compileOnly group: 'com.google.guava', name: 'guava', version:'32.1.2-jre'
-    compileOnly group: 'com.google.guava', name: 'failureaccess', version:'1.0.1'
-    implementation group: 'org.javassist', name: 'javassist', version:'3.28.0-GA'
-    implementation group: 'org.apache.commons', name: 'commons-math3', version: '3.6.1'
-    implementation group: 'com.google.code.gson', name: 'gson', version: '2.8.9'
-    implementation group: 'com.yahoo.datasketches', name: 'sketches-core', version: '0.13.4'
-    implementation group: 'com.yahoo.datasketches', name: 'memory', version: '0.12.2'
-    implementation group: 'commons-lang', name: 'commons-lang', version: '2.6'
-    implementation group: 'org.apache.commons', name: 'commons-pool2', version: '2.10.0'
-    implementation 'software.amazon.randomcutforest:randomcutforest-serialization:3.8.0'
-    implementation 'software.amazon.randomcutforest:randomcutforest-parkservices:3.8.0'
-    implementation 'software.amazon.randomcutforest:randomcutforest-core:3.8.0'
-
-    // force Jackson version to avoid version conflict issue
-    implementation "com.fasterxml.jackson.core:jackson-databind:2.16.1"
-    implementation "com.fasterxml.jackson.core:jackson-annotations:2.16.1"
-
-    // used for serializing/deserializing rcf models.
-    implementation group: 'io.protostuff', name: 'protostuff-core', version: '1.8.0'
-    implementation group: 'io.protostuff', name: 'protostuff-runtime', version: '1.8.0'
-    implementation group: 'io.protostuff', name: 'protostuff-api', version: '1.8.0'
-    implementation group: 'io.protostuff', name: 'protostuff-collectionschema', version: '1.8.0'
-    implementation group: 'org.apache.commons', name: 'commons-lang3', version: '3.13.0'
-
-    implementation "org.jacoco:org.jacoco.agent:0.8.5"
-    implementation ("org.jacoco:org.jacoco.ant:0.8.5") {
-        exclude group: 'org.ow2.asm', module: 'asm-commons'
-        exclude group: 'org.ow2.asm', module: 'asm'
-        exclude group: 'org.ow2.asm', module: 'asm-tree'
-    }
-
-    testImplementation group: 'pl.pragmatists', name: 'JUnitParams', version: '1.1.1'
-    testImplementation group: 'org.mockito', name: 'mockito-core', version: '5.9.0'
-    testImplementation group: 'org.objenesis', name: 'objenesis', version: '3.3'
-    testImplementation group: 'net.bytebuddy', name: 'byte-buddy', version: '1.14.9'
-    testImplementation group: 'net.bytebuddy', name: 'byte-buddy-agent', version: '1.14.9'
-    testCompileOnly 'org.apiguardian:apiguardian-api:1.1.0'
-    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.8.2'
-    testImplementation 'org.junit.jupiter:junit-jupiter-params:5.8.2'
-    testImplementation 'org.junit.jupiter:junit-jupiter-engine:5.8.2'
-    testRuntimeOnly 'org.junit.vintage:junit-vintage-engine:5.8.2'
-    testCompileOnly 'junit:junit:4.13.2'
-}
-
 compileJava.options.compilerArgs << "-Xlint:-deprecation,-rawtypes,-serial,-try,-unchecked"
 
 apply plugin: 'com.netflix.nebula.ospackage'
@@ -805,7 +736,7 @@ afterEvaluate {
         version = "${project.version}" - "-SNAPSHOT"
 
         into '/usr/share/opensearch/plugins'
-        from(zipTree(bundlePlugin.archivePath)) {
+        from(zipTree(bundlePlugin.archiveFile)) {
             into opensearchplugin.name
         }
@@ -895,12 +826,3 @@ task updateVersion {
         ant.replaceregexp(file:'build.gradle', match: '"opensearch.version", "\\d.*"', replace: '"opensearch.version", "' + newVersion.tokenize('-')[0] + '-SNAPSHOT"', flags:'g', byline:true)
     }
 }
-
-tasks.withType(AbstractPublishToMaven) {
-    def predicate = provider {
-        publication.name == "pluginZip"
-    }
-    onlyIf("Publishing only ZIP distributions") {
-        predicate.get()
-    }
-}
diff --git a/gradle.properties b/gradle.properties
new file mode 100644
index 000000000..d2eba77fc
--- /dev/null
+++ b/gradle.properties
@@ -0,0 +1,30 @@
+#
+# Copyright OpenSearch Contributors
+# SPDX-License-Identifier: Apache-2.0
+#
+
+# Enable build caching
+org.gradle.caching=true
+org.gradle.warning.mode=none
+org.gradle.parallel=true
+org.gradle.jvmargs=-Xmx3g -XX:+HeapDumpOnOutOfMemoryError -Xss2m \
+    --add-exports jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED \
+    --add-exports jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED \
+    --add-exports jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED \
+    --add-exports jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED \
+    --add-exports jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED
+options.forkOptions.memoryMaximumSize=3g
+
+# Disable duplicate project id detection
+# See https://docs.gradle.org/current/userguide/upgrading_version_6.html#duplicate_project_names_may_cause_publication_to_fail
+systemProp.org.gradle.dependency.duplicate.project.detection=false
+
+# Enforce the build to fail on deprecated gradle api usage
+systemProp.org.gradle.warning.mode=fail
+
+# forcing to use TLS1.2 to avoid failure in vault
+# see https://github.com/hashicorp/vault/issues/8750#issuecomment-631236121
+systemProp.jdk.tls.client.protocols=TLSv1.2
+
+# jvm args for faster test execution by default
+systemProp.tests.jvm.argline=-XX:TieredStopAtLevel=1 -XX:ReservedCodeCacheSize=64m
diff --git a/gradle/wrapper/gradle-wrapper.jar b/gradle/wrapper/gradle-wrapper.jar
index 7f93135c49b765f8051ef9d0a6055ff8e46073d8..d64cd4917707c1f8861d8cb53dd15194d4248596 100644
GIT binary patch
[base85-encoded binary payload omitted: literal 43462 (new gradle-wrapper.jar) and literal 63721 (previous gradle-wrapper.jar)]
zNbQEw8&sO5jAYmkD|Zaz_yUb0rC})U!rCHOl}JhbYIDLzLvrZVw0~JO`d*6f;X&?V=#T@ND*cv^I;`sFeq4 z##H5;gpZTb^0Hz@3C*~u0AqqNZ-r%rN3KD~%Gw`0XsIq$(^MEb<~H(2*5G^<2(*aI z%7}WB+TRlMIrEK#s0 z93xn*Ohb=kWFc)BNHG4I(~RPn-R8#0lqyBBz5OM6o5|>x9LK@%HaM}}Y5goCQRt2C z{j*2TtT4ne!Z}vh89mjwiSXG=%DURar~=kGNNaO_+Nkb+tRi~Rkf!7a$*QlavziD( z83s4GmQ^Wf*0Bd04f#0HX@ua_d8 z23~z*53ePD6@xwZ(vdl0DLc=>cPIOPOdca&MyR^jhhKrdQO?_jJh`xV3GKz&2lvP8 zEOwW6L*ufvK;TN{=S&R@pzV^U=QNk^Ec}5H z+2~JvEVA{`uMAr)?Kf|aW>33`)UL@bnfIUQc~L;TsTQ6>r-<^rB8uoNOJ>HWgqMI8 zSW}pZmp_;z_2O5_RD|fGyTxaxk53Hg_3Khc<8AUzV|ZeK{fp|Ne933=1&_^Dbv5^u zB9n=*)k*tjHDRJ@$bp9mrh}qFn*s}npMl5BMDC%Hs0M0g-hW~P*3CNG06G!MOPEQ_ zi}Qs-6M8aMt;sL$vlmVBR^+Ry<64jrm1EI1%#j?c?4b*7>)a{aDw#TfTYKq+SjEFA z(aJ&z_0?0JB83D-i3Vh+o|XV4UP+YJ$9Boid2^M2en@APw&wx7vU~t$r2V`F|7Qfo z>WKgI@eNBZ-+Og<{u2ZiG%>YvH2L3fNpV9J;WLJoBZda)01Rn;o@){01{7E#ke(7U zHK>S#qZ(N=aoae*4X!0A{)nu0R_sKpi1{)u>GVjC+b5Jyl6#AoQ-1_3UDovNSo`T> z?c-@7XX*2GMy?k?{g)7?Sv;SJkmxYPJPs!&QqB12ejq`Lee^-cDveVWL^CTUldb(G zjDGe(O4P=S{4fF=#~oAu>LG>wrU^z_?3yt24FOx>}{^lCGh8?vtvY$^hbZ)9I0E3r3NOlb9I?F-Yc=r$*~l`4N^xzlV~N zl~#oc>U)Yjl0BxV>O*Kr@lKT{Z09OXt2GlvE38nfs+DD7exl|&vT;)>VFXJVZp9Np zDK}aO;R3~ag$X*|hRVY3OPax|PG`@_ESc8E!mHRByJbZQRS38V2F__7MW~sgh!a>98Q2%lUNFO=^xU52|?D=IK#QjwBky-C>zOWlsiiM&1n z;!&1((Xn1$9K}xabq~222gYvx3hnZPg}VMF_GV~5ocE=-v>V=T&RsLBo&`)DOyIj* zLV{h)JU_y*7SdRtDajP_Y+rBkNN*1_TXiKwHH2&p51d(#zv~s#HwbNy?<+(=9WBvo zw2hkk2Dj%kTFhY+$T+W-b7@qD!bkfN#Z2ng@Pd=i3-i?xYfs5Z*1hO?kd7Sp^9`;Y zM2jeGg<-nJD1er@Pc_cSY7wo5dzQX44=%6rn}P_SRbpzsA{6B+!$3B0#;}qwO37G^ zL(V_5JK`XT?OHVk|{_$vQ|oNEpab*BO4F zUTNQ7RUhnRsU`TK#~`)$icsvKh~(pl=3p6m98@k3P#~upd=k*u20SNcb{l^1rUa)>qO997)pYRWMncC8A&&MHlbW?7i^7M`+B$hH~Y|J zd>FYOGQ;j>Zc2e7R{KK7)0>>nn_jYJy&o@sK!4G>-rLKM8Hv)f;hi1D2fAc$+six2 zyVZ@wZ6x|fJ!4KrpCJY=!Mq0;)X)OoS~{Lkh6u8J`eK%u0WtKh6B>GW_)PVc zl}-k`p09qwGtZ@VbYJC!>29V?Dr>>vk?)o(x?!z*9DJ||9qG-&G~#kXxbw{KKYy}J zQKa-dPt~M~E}V?PhW0R26xdA%1T*%ra6SguGu50YHngOTIv)@N|YttEXo#OZfgtP7;H?EeZZxo<}3YlYxtBq znJ!WFR^tmGf0Py}N?kZ(#=VtpC@%xJkDmfcCoBTxq zr_|5gP?u1@vJZbxPZ|G0AW4=tpb84gM2DpJU||(b8kMOV1S3|(yuwZJ&rIiFW(U;5 zUtAW`O6F6Zy+eZ1EDuP~AAHlSY-+A_eI5Gx)%*uro5tljy}kCZU*_d7)oJ>oQSZ3* zneTn`{gnNC&uJd)0aMBzAg021?YJ~b(fmkwZAd696a=0NzBAqBN54KuNDwa*no(^O z6p05bioXUR^uXjpTol*ppHp%1v9e)vkoUAUJyBx3lw0UO39b0?^{}yb!$yca(@DUn zCquRF?t=Zb9`Ed3AI6|L{eX~ijVH`VzSMheKoP7LSSf4g>md>`yi!TkoG5P>Ofp+n z(v~rW+(5L96L{vBb^g51B=(o)?%%xhvT*A5btOpw(TKh^g^4c zw>0%X!_0`{iN%RbVk+A^f{w-4-SSf*fu@FhruNL##F~sF24O~u zyYF<3el2b$$wZ_|uW#@Ak+VAGk#e|kS8nL1g>2B-SNMjMp^8;-FfeofY2fphFHO!{ z*!o4oTb{4e;S<|JEs<1_hPsmAlVNk?_5-Fp5KKU&d#FiNW~Y+pVFk@Cua1I{T+1|+ zHx6rFMor)7L)krbilqsWwy@T+g3DiH5MyVf8Wy}XbEaoFIDr~y;@r&I>FMW{ z?Q+(IgyebZ)-i4jNoXQhq4Muy9Fv+OxU;9_Jmn+<`mEC#%2Q_2bpcgzcinygNI!&^ z=V$)o2&Yz04~+&pPWWn`rrWxJ&}8khR)6B(--!9Q zubo}h+1T)>a@c)H^i``@<^j?|r4*{;tQf78(xn0g39IoZw0(CwY1f<%F>kEaJ zp9u|IeMY5mRdAlw*+gSN^5$Q)ShM<~E=(c8QM+T-Qk)FyKz#Sw0EJ*edYcuOtO#~Cx^(M7w5 z3)rl#L)rF|(Vun2LkFr!rg8Q@=r>9p>(t3Gf_auiJ2Xx9HmxYTa|=MH_SUlYL`mz9 zTTS$`%;D-|Jt}AP1&k7PcnfFNTH0A-*FmxstjBDiZX?}%u%Yq94$fUT&z6od+(Uk> zuqsld#G(b$G8tus=M!N#oPd|PVFX)?M?tCD0tS%2IGTfh}3YA3f&UM)W$_GNV8 zQo+a(ml2Km4o6O%gKTCSDNq+#zCTIQ1*`TIJh~k6Gp;htHBFnne))rlFdGqwC6dx2+La1&Mnko*352k0y z+tQcwndQlX`nc6nb$A9?<-o|r*%aWXV#=6PQic0Ok_D;q>wbv&j7cKc!w4~KF#-{6 z(S%6Za)WpGIWf7jZ3svNG5OLs0>vCL9{V7cgO%zevIVMH{WgP*^D9ws&OqA{yr|m| zKD4*07dGXshJHd#e%x%J+qmS^lS|0Bp?{drv;{@{l9ArPO&?Q5=?OO9=}h$oVe#3b z3Yofj&Cb}WC$PxmRRS)H%&$1-)z7jELS}!u!zQ?A^Y{Tv4QVt*vd@uj-^t2fYRzQj zfxGR>-q|o$3sGn^#VzZ!QQx?h9`njeJry}@x?|k0-GTTA4y3t2E`3DZ!A~D?GiJup z)8%PK2^9OVRlP(24P^4_<|D=H^7}WlWu#LgsdHzB%cPy|f8dD3|A^mh4WXxhLTVu_ 
z@abE{6Saz|Y{rXYPd4$tfPYo}ef(oQWZ=4Bct-=_9`#Qgp4ma$n$`tOwq#&E18$B; z@Bp)bn3&rEi0>fWWZ@7k5WazfoX`SCO4jQWwVuo+$PmSZn^Hz?O(-tW@*DGxuf)V1 zO_xm&;NVCaHD4dqt(-MlszI3F-p?0!-e$fbiCeuaw66h^TTDLWuaV<@C-`=Xe5WL) zwooG7h>4&*)p3pKMS3O!4>-4jQUN}iAMQ)2*70?hP~)TzzR?-f@?Aqy$$1Iy8VGG$ zMM?8;j!pUX7QQD$gRc_#+=raAS577ga-w?jd`vCiN5lu)dEUkkUPl9!?{$IJNxQys z*E4e$eF&n&+AMRQR2gcaFEjAy*r)G!s(P6D&TfoApMFC_*Ftx0|D0@E-=B7tezU@d zZ{hGiN;YLIoSeRS;9o%dEua4b%4R3;$SugDjP$x;Z!M!@QibuSBb)HY!3zJ7M;^jw zlx6AD50FD&p3JyP*>o+t9YWW8(7P2t!VQQ21pHJOcG_SXQD;(5aX#M6x##5H_Re>6lPyDCjxr*R(+HE%c&QN+b^tbT zXBJk?p)zhJj#I?&Y2n&~XiytG9!1ox;bw5Rbj~)7c(MFBb4>IiRATdhg zmiEFlj@S_hwYYI(ki{}&<;_7(Z0Qkfq>am z&LtL=2qc7rWguk3BtE4zL41@#S;NN*-jWw|7Kx7H7~_%7fPt;TIX}Ubo>;Rmj94V> zNB1=;-9AR7s`Pxn}t_6^3ahlq53e&!Lh85uG zec0vJY_6e`tg7LgfrJ3k!DjR)Bi#L@DHIrZ`sK=<5O0Ip!fxGf*OgGSpP@Hbbe&$9 z;ZI}8lEoC2_7;%L2=w?tb%1oL0V+=Z`7b=P&lNGY;yVBazXRYu;+cQDKvm*7NCxu&i;zub zAJh#11%?w>E2rf2e~C4+rAb-&$^vsdACs7 z@|Ra!OfVM(ke{vyiqh7puf&Yp6cd6{DptUteYfIRWG3pI+5< zBVBI_xkBAc<(pcb$!Y%dTW(b;B;2pOI-(QCsLv@U-D1XJ z(Gk8Q3l7Ws46Aktuj>|s{$6zA&xCPuXL-kB`CgYMs}4IeyG*P51IDwW?8UNQd+$i~ zlxOPtSi5L|gJcF@DwmJA5Ju8HEJ>o{{upwIpb!f{2(vLNBw`7xMbvcw<^{Fj@E~1( z?w`iIMieunS#>nXlmUcSMU+D3rX28f?s7z;X=se6bo8;5vM|O^(D6{A9*ChnGH!RG zP##3>LDC3jZPE4PH32AxrqPk|yIIrq~`aL-=}`okhNu9aT%q z1b)7iJ)CN=V#Ly84N_r7U^SH2FGdE5FpTO2 z630TF$P>GNMu8`rOytb(lB2};`;P4YNwW1<5d3Q~AX#P0aX}R2b2)`rgkp#zTxcGj zAV^cvFbhP|JgWrq_e`~exr~sIR$6p5V?o4Wym3kQ3HA+;Pr$bQ0(PmADVO%MKL!^q z?zAM8j1l4jrq|5X+V!8S*2Wl@=7*pPgciTVK6kS1Ge zMsd_u6DFK$jTnvVtE;qa+8(1sGBu~n&F%dh(&c(Zs4Fc#A=gG^^%^AyH}1^?|8quj zl@Z47h$){PlELJgYZCIHHL= z{U8O>Tw4x3<1{?$8>k-P<}1y9DmAZP_;(3Y*{Sk^H^A=_iSJ@+s5ktgwTXz_2$~W9>VVZsfwCm@s0sQ zeB50_yu@uS+e7QoPvdCwDz{prjo(AFwR%C?z`EL{1`|coJHQTk^nX=tvs1<0arUOJ z!^`*x&&BvTYmemyZ)2p~{%eYX=JVR?DYr(rNgqRMA5E1PR1Iw=prk=L2ldy3r3Vg@27IZx43+ywyzr-X*p*d@tZV+!U#~$-q=8c zgdSuh#r?b4GhEGNai)ayHQpk>5(%j5c@C1K3(W1pb~HeHpaqijJZa-e6vq_8t-^M^ zBJxq|MqZc?pjXPIH}70a5vt!IUh;l}<>VX<-Qcv^u@5(@@M2CHSe_hD$VG-eiV^V( zj7*9T0?di?P$FaD6oo?)<)QT>Npf6Og!GO^GmPV(Km0!=+dE&bk#SNI+C9RGQ|{~O*VC+tXK3!n`5 zHfl6>lwf_aEVV3`0T!aHNZLsj$paS$=LL(?b!Czaa5bbSuZ6#$_@LK<(7yrrl+80| z{tOFd=|ta2Z`^ssozD9BINn45NxUeCQis?-BKmU*Kt=FY-NJ+)8S1ecuFtN-M?&42 zl2$G>u!iNhAk*HoJ^4v^9#ORYp5t^wDj6|lx~5w45#E5wVqI1JQ~9l?nPp1YINf++ zMAdSif~_ETv@Er(EFBI^@L4BULFW>)NI+ejHFP*T}UhWNN`I)RRS8za? z*@`1>9ZB}An%aT5K=_2iQmfE;GcBVHLF!$`I99o5GO`O%O_zLr9AG18>&^HkG(;=V z%}c!OBQ~?MX(9h~tajX{=x)+!cbM7$YzTlmsPOdp2L-?GoW`@{lY9U3f;OUo*BwRB z8A+nv(br0-SH#VxGy#ZrgnGD(=@;HME;yd46EgWJ`EL%oXc&lFpc@Y}^>G(W>h_v_ zlN!`idhX+OjL+~T?19sroAFVGfa5tX-D49w$1g2g_-T|EpHL6}K_aX4$K=LTvwtlF zL*z}j{f+Uoe7{-px3_5iKPA<_7W=>Izkk)!l9ez2w%vi(?Y;i8AxRNLSOGDzNoqoI zP!1uAl}r=_871(G?y`i&)-7{u=%nxk7CZ_Qh#!|ITec zwQn`33GTUM`;D2POWnkqngqJhJRlM>CTONzTG}>^Q0wUunQyn|TAiHzyX2_%ATx%P z%7gW)%4rA9^)M<_%k@`Y?RbC<29sWU&5;@|9thf2#zf8z12$hRcZ!CSb>kUp=4N#y zl3hE#y6>kkA8VY2`W`g5Ip?2qC_BY$>R`iGQLhz2-S>x(RuWv)SPaGdl^)gGw7tjR zH@;jwk!jIaCgSg_*9iF|a);sRUTq30(8I(obh^|}S~}P4U^BIGYqcz;MPpC~Y@k_m zaw4WG1_vz2GdCAX!$_a%GHK**@IrHSkGoN>)e}>yzUTm52on`hYot7cB=oA-h1u|R ztH$11t?54Qg2L+i33FPFKKRm1aOjKST{l1*(nps`>sv%VqeVMWjl5+Gh+9);hIP8? 
zA@$?}Sc z3qIRpba+y5yf{R6G(u8Z^vkg0Fu&D-7?1s=QZU`Ub{-!Y`I?AGf1VNuc^L3v>)>i# z{DV9W$)>34wnzAXUiV^ZpYKw>UElrN_5Xj6{r_3| z$X5PK`e5$7>~9Dj7gK5ash(dvs`vwfk}&RD`>04;j62zoXESkFBklYaKm5seyiX(P zqQ-;XxlV*yg?Dhlx%xt!b0N3GHp@(p$A;8|%# zZ5m2KL|{on4nr>2_s9Yh=r5ScQ0;aMF)G$-9-Ca6%wA`Pa)i?NGFA|#Yi?{X-4ZO_ z^}%7%vkzvUHa$-^Y#aA+aiR5sa%S|Ebyn`EV<3Pc?ax_f>@sBZF1S;7y$CXd5t5=WGsTKBk8$OfH4v|0?0I=Yp}7c=WBSCg!{0n)XmiU;lfx)**zZaYqmDJelxk$)nZyx5`x$6R|fz(;u zEje5Dtm|a%zK!!tk3{i9$I2b{vXNFy%Bf{50X!x{98+BsDr_u9i>G5%*sqEX|06J0 z^IY{UcEbj6LDwuMh7cH`H@9sVt1l1#8kEQ(LyT@&+K}(ReE`ux8gb0r6L_#bDUo^P z3Ka2lRo52Hdtl_%+pwVs14=q`{d^L58PsU@AMf(hENumaxM{7iAT5sYmWh@hQCO^ zK&}ijo=`VqZ#a3vE?`7QW0ZREL17ZvDfdqKGD?0D4fg{7v%|Yj&_jcKJAB)>=*RS* zto8p6@k%;&^ZF>hvXm&$PCuEp{uqw3VPG$9VMdW5$w-fy2CNNT>E;>ejBgy-m_6`& z97L1p{%srn@O_JQgFpa_#f(_)eb#YS>o>q3(*uB;uZb605(iqM$=NK{nHY=+X2*G) zO3-_Xh%aG}fHWe*==58zBwp%&`mge<8uq8;xIxOd=P%9EK!34^E9sk|(Zq1QSz-JVeP12Fp)-`F|KY$LPwUE?rku zY@OJ)Z9A!ojfzfeyJ9;zv2EM7ZQB)AR5xGa-tMn^bl)FmoIiVyJ@!~@%{}qXXD&Ns zPnfe5U+&ohKefILu_1mPfLGuapX@btta5C#gPB2cjk5m4T}Nfi+Vfka!Yd(L?-c~5 z#ZK4VeQEXNPc4r$K00Fg>g#_W!YZ)cJ?JTS<&68_$#cZT-ME`}tcwqg3#``3M3UPvn+pi}(VNNx6y zFIMVb6OwYU(2`at$gHba*qrMVUl8xk5z-z~fb@Q3Y_+aXuEKH}L+>eW__!IAd@V}L zkw#s%H0v2k5-=vh$^vPCuAi22Luu3uKTf6fPo?*nvj$9(u)4$6tvF-%IM+3pt*cgs z_?wW}J7VAA{_~!?))?s6{M=KPpVhg4fNuU*|3THp@_(q!b*hdl{fjRVFWtu^1dV(f z6iOux9hi&+UK=|%M*~|aqFK{Urfl!TA}UWY#`w(0P!KMe1Si{8|o))Gy6d7;!JQYhgMYmXl?3FfOM2nQGN@~Ap6(G z3+d_5y@=nkpKAhRqf{qQ~k7Z$v&l&@m7Ppt#FSNzKPZM z8LhihcE6i=<(#87E|Wr~HKvVWhkll4iSK$^mUHaxgy8*K$_Zj;zJ`L$naPj+^3zTi z-3NTaaKnD5FPY-~?Tq6QHnmDDRxu0mh0D|zD~Y=vv_qig5r-cIbCpxlju&8Sya)@{ zsmv6XUSi)@(?PvItkiZEeN*)AE~I_?#+Ja-r8$(XiXei2d@Hi7Rx8+rZZb?ZLa{;@*EHeRQ-YDadz~M*YCM4&F-r;E#M+@CSJMJ0oU|PQ^ z=E!HBJDMQ2TN*Y(Ag(ynAL8%^v;=~q?s4plA_hig&5Z0x_^Oab!T)@6kRN$)qEJ6E zNuQjg|G7iwU(N8pI@_6==0CL;lRh1dQF#wePhmu@hADFd3B5KIH#dx(2A zp~K&;Xw}F_N6CU~0)QpQk7s$a+LcTOj1%=WXI(U=Dv!6 z{#<#-)2+gCyyv=Jw?Ab#PVkxPDeH|sAxyG`|Ys}A$PW4TdBv%zDz z^?lwrxWR<%Vzc8Sgt|?FL6ej_*e&rhqJZ3Y>k=X(^dytycR;XDU16}Pc9Vn0>_@H+ zQ;a`GSMEG64=JRAOg%~L)x*w{2re6DVprNp+FcNra4VdNjiaF0M^*>CdPkt(m150rCue?FVdL0nFL$V%5y6N z%eLr5%YN7D06k5ji5*p4v$UMM)G??Q%RB27IvH7vYr_^3>1D-M66#MN8tWGw>WED} z5AhlsanO=STFYFs)Il_0i)l)f<8qn|$DW7ZXhf5xI;m+7M5-%P63XFQrG9>DMqHc} zsgNU9nR`b}E^mL5=@7<1_R~j@q_2U^3h|+`7YH-?C=vme1C3m`Fe0HC>pjt6f_XMh zy~-i-8R46QNYneL4t@)<0VU7({aUO?aH`z4V2+kxgH5pYD5)wCh75JqQY)jIPN=U6 z+qi8cGiOtXG2tXm;_CfpH9ESCz#i5B(42}rBJJF$jh<1sbpj^8&L;gzGHb8M{of+} zzF^8VgML2O9nxBW7AvdEt90vp+#kZxWf@A)o9f9}vKJy9NDBjBW zSt=Hcs=YWCwnfY1UYx*+msp{g!w0HC<_SM!VL1(I2PE?CS}r(eh?{I)mQixmo5^p# zV?2R!R@3GV6hwTCrfHiK#3Orj>I!GS2kYhk1S;aFBD_}u2v;0HYFq}Iz1Z(I4oca4 zxquja8$+8JW_EagDHf$a1OTk5S97umGSDaj)gH=fLs9>_=XvVj^Xj9a#gLdk=&3tl zfmK9MNnIX9v{?%xdw7568 zNrZ|roYs(vC4pHB5RJ8>)^*OuyNC>x7ad)tB_}3SgQ96+-JT^Qi<`xi=)_=$Skwv~ zdqeT9Pa`LYvCAn&rMa2aCDV(TMI#PA5g#RtV|CWpgDYRA^|55LLN^uNh*gOU>Z=a06qJ;$C9z8;n-Pq=qZnc1zUwJ@t)L;&NN+E5m zRkQ(SeM8=l-aoAKGKD>!@?mWTW&~)uF2PYUJ;tB^my`r9n|Ly~0c%diYzqs9W#FTjy?h&X3TnH zXqA{QI82sdjPO->f=^K^f>N`+B`q9&rN0bOXO79S&a9XX8zund(kW7O76f4dcWhIu zER`XSMSFbSL>b;Rp#`CuGJ&p$s~G|76){d?xSA5wVg##_O0DrmyEYppyBr%fyWbbv zp`K84JwRNP$d-pJ!Qk|(RMr?*!wi1if-9G#0p>>1QXKXWFy)eB3ai)l3601q8!9JC zvU#ZWWDNKq9g6fYs?JQ)Q4C_cgTy3FhgKb8s&m)DdmL5zhNK#8wWg!J*7G7Qhe9VU zha?^AQTDpYcuN!B+#1dE*X{<#!M%zfUQbj=zLE{dW0XeQ7-oIsGY6RbkP2re@Q{}r_$iiH0xU%iN*ST`A)-EH6eaZB$GA#v)cLi z*MpA(3bYk$oBDKAzu^kJoSUsDd|856DApz={3u8sbQV@JnRkp2nC|)m;#T=DvIL-O zI4vh;g7824l}*`_p@MT4+d`JZ2%6NQh=N9bmgJ#q!hK@_<`HQq3}Z8Ij>3%~<*= zcv=!oT#5xmeGI92lqm9sGVE%#X$ls;St|F#u!?5Y7syhx6q#MVRa&lBmmn%$C0QzU z);*ldgwwCmzM3uglr}!Z2G+?& 
zf%Dpo&mD%2ZcNFiN-Z0f;c_Q;A%f@>26f?{d1kxIJD}LxsQkB47SAdwinfMILZdN3 zfj^HmTzS3Ku5BxY>ANutS8WPQ-G>v4^_Qndy==P3pDm+Xc?>rUHl-4+^%Sp5atOja z2oP}ftw-rqnb}+khR3CrRg^ibi6?QYk1*i^;kQGirQ=uB9Sd1NTfT-Rbv;hqnY4neE5H1YUrjS2m+2&@uXiAo- zrKUX|Ohg7(6F(AoP~tj;NZlV#xsfo-5reuQHB$&EIAhyZk;bL;k9ouDmJNBAun;H& zn;Of1z_Qj`x&M;5X;{s~iGzBQTY^kv-k{ksbE*Dl%Qf%N@hQCfY~iUw!=F-*$cpf2 z3wix|aLBV0b;W@z^%7S{>9Z^T^fLOI68_;l@+Qzaxo`nAI8emTV@rRhEKZ z?*z_{oGdI~R*#<2{bkz$G~^Qef}$*4OYTgtL$e9q!FY7EqxJ2`zk6SQc}M(k(_MaV zSLJnTXw&@djco1~a(vhBl^&w=$fa9{Sru>7g8SHahv$&Bl(D@(Zwxo_3r=;VH|uc5 zi1Ny)J!<(KN-EcQ(xlw%PNwK8U>4$9nVOhj(y0l9X^vP1TA>r_7WtSExIOsz`nDOP zs}d>Vxb2Vo2e5x8p(n~Y5ggAyvib>d)6?)|E@{FIz?G3PVGLf7-;BxaP;c?7ddH$z zA+{~k^V=bZuXafOv!RPsE1GrR3J2TH9uB=Z67gok+u`V#}BR86hB1xl}H4v`F+mRfr zYhortD%@IGfh!JB(NUNSDh+qDz?4ztEgCz&bIG-Wg7w-ua4ChgQR_c+z8dT3<1?uX z*G(DKy_LTl*Ea!%v!RhpCXW1WJO6F`bgS-SB;Xw9#! z<*K}=#wVu9$`Yo|e!z-CPYH!nj7s9dEPr-E`DXUBu0n!xX~&|%#G=BeM?X@shQQMf zMvr2!y7p_gD5-!Lnm|a@z8Of^EKboZsTMk%5VsJEm>VsJ4W7Kv{<|#4f-qDE$D-W>gWT%z-!qXnDHhOvLk=?^a1*|0j z{pW{M0{#1VcR5;F!!fIlLVNh_Gj zbnW(_j?0c2q$EHIi@fSMR{OUKBcLr{Y&$hrM8XhPByyZaXy|dd&{hYQRJ9@Fn%h3p7*VQolBIV@Eq`=y%5BU~3RPa^$a?ixp^cCg z+}Q*X+CW9~TL29@OOng(#OAOd!)e$d%sr}^KBJ-?-X&|4HTmtemxmp?cT3uA?md4% zT8yZ0U;6Rg6JHy3fJae{6TMGS?ZUX6+gGTT{Q{)SI85$5FD{g-eR%O0KMpWPY`4@O zx!hen1*8^E(*}{m^V_?}(b5k3hYo=T+$&M32+B`}81~KKZhY;2H{7O-M@vbCzuX0n zW-&HXeyr1%I3$@ns-V1~Lb@wIpkmx|8I~ob1Of7i6BTNysEwI}=!nU%q7(V_^+d*G z7G;07m(CRTJup!`cdYi93r^+LY+`M*>aMuHJm(A8_O8C#A*$!Xvddgpjx5)?_EB*q zgE8o5O>e~9IiSC@WtZpF{4Bj2J5eZ>uUzY%TgWF7wdDE!fSQIAWCP)V{;HsU3ap?4 znRsiiDbtN7i9hapO;(|Ew>Ip2TZSvK9Z^N21%J?OiA_&eP1{(Pu_=%JjKy|HOardq ze?zK^K zA%sjF64*Wufad%H<) z^|t>e*h+Z1#l=5wHexzt9HNDNXgM=-OPWKd^5p!~%SIl>Fo&7BvNpbf8{NXmH)o{r zO=aBJ;meX1^{O%q;kqdw*5k!Y7%t_30 zy{nGRVc&5qt?dBwLs+^Sfp;f`YVMSB#C>z^a9@fpZ!xb|b-JEz1LBX7ci)V@W+kvQ89KWA0T~Lj$aCcfW#nD5bt&Y_< z-q{4ZXDqVg?|0o)j1%l0^_it0WF*LCn-+)c!2y5yS7aZIN$>0LqNnkujV*YVes(v$ zY@_-!Q;!ZyJ}Bg|G-~w@or&u0RO?vlt5*9~yeoPV_UWrO2J54b4#{D(D>jF(R88u2 zo#B^@iF_%S>{iXSol8jpmsZuJ?+;epg>k=$d`?GSegAVp3n$`GVDvK${N*#L_1`44 z{w0fL{2%)0|E+qgZtjX}itZz^KJt4Y;*8uSK}Ft38+3>j|K(PxIXXR-t4VopXo#9# zt|F{LWr-?34y`$nLBVV_*UEgA6AUI65dYIbqpNq9cl&uLJ0~L}<=ESlOm?Y-S@L*d z<7vt}`)TW#f%Rp$Q}6@3=j$7Tze@_uZO@aMn<|si{?S}~maII`VTjs&?}jQ4_cut9$)PEqMukwoXobzaKx^MV z2fQwl+;LSZ$qy%Tys0oo^K=jOw$!YwCv^ei4NBVauL)tN%=wz9M{uf{IB(BxK|lT*pFkmNK_1tV`nb%jH=a0~VNq2RCKY(rG7jz!-D^k)Ec)yS%17pE#o6&eY+ z^qN(hQT$}5F(=4lgNQhlxj?nB4N6ntUY6(?+R#B?W3hY_a*)hnr4PA|vJ<6p`K3Z5Hy z{{8(|ux~NLUW=!?9Qe&WXMTAkQnLXg(g=I@(VG3{HE13OaUT|DljyWXPs2FE@?`iU z4GQlM&Q=T<4&v@Fe<+TuXiZQT3G~vZ&^POfmI1K2h6t4eD}Gk5XFGpbj1n_g*{qmD6Xy z`6Vv|lLZtLmrnv*{Q%xxtcWVj3K4M%$bdBk_a&ar{{GWyu#ljM;dII;*jP;QH z#+^o-A4np{@|Mz+LphTD0`FTyxYq#wY)*&Ls5o{0z9yg2K+K7ZN>j1>N&;r+Z`vI| zDzG1LJZ+sE?m?>x{5LJx^)g&pGEpY=fQ-4}{x=ru;}FL$inHemOg%|R*ZXPodU}Kh zFEd5#+8rGq$Y<_?k-}r5zgQ3jRV=ooHiF|@z_#D4pKVEmn5CGV(9VKCyG|sT9nc=U zEoT67R`C->KY8Wp-fEcjjFm^;Cg(ls|*ABVHq8clBE(;~K^b+S>6uj70g? 
z&{XQ5U&!Z$SO7zfP+y^8XBbiu*Cv-yJG|l-oe*!s5$@Lh_KpxYL2sx`B|V=dETN>5K+C+CU~a_3cI8{vbu$TNVdGf15*>D zz@f{zIlorkY>TRh7mKuAlN9A0>N>SV`X)+bEHms=mfYTMWt_AJtz_h+JMmrgH?mZt zm=lfdF`t^J*XLg7v+iS)XZROygK=CS@CvUaJo&w2W!Wb@aa?~Drtf`JV^cCMjngVZ zv&xaIBEo8EYWuML+vxCpjjY^s1-ahXJzAV6hTw%ZIy!FjI}aJ+{rE&u#>rs)vzuxz z+$5z=7W?zH2>Eb32dvgHYZtCAf!=OLY-pb4>Ae79rd68E2LkVPj-|jFeyqtBCCwiW zkB@kO_(3wFq)7qwV}bA=zD!*@UhT`geq}ITo%@O(Z5Y80nEX~;0-8kO{oB6|(4fQh z);73T!>3@{ZobPwRv*W?7m0Ml9GmJBCJd&6E?hdj9lV= z4flNfsc(J*DyPv?RCOx!MSvk(M952PJ-G|JeVxWVjN~SNS6n-_Ge3Q;TGE;EQvZg86%wZ`MB zSMQua(i*R8a75!6$QRO^(o7sGoomb+Y{OMy;m~Oa`;P9Yqo>?bJAhqXxLr7_3g_n>f#UVtxG!^F#1+y@os6x(sg z^28bsQ@8rw%Gxk-stAEPRbv^}5sLe=VMbkc@Jjimqjvmd!3E7+QnL>|(^3!R} zD-l1l7*Amu@j+PWLGHXXaFG0Ct2Q=}5YNUxEQHCAU7gA$sSC<5OGylNnQUa>>l%sM zyu}z6i&({U@x^hln**o6r2s-(C-L50tQvz|zHTqW!ir?w&V23tuYEDJVV#5pE|OJu z7^R!A$iM$YCe?8n67l*J-okwfZ+ZTkGvZ)tVPfR;|3gyFjF)8V zyXXN=!*bpyRg9#~Bg1+UDYCt0 ztp4&?t1X0q>uz;ann$OrZs{5*r`(oNvw=$7O#rD|Wuv*wIi)4b zGtq4%BX+kkagv3F9Id6~-c+1&?zny%w5j&nk9SQfo0k4LhdSU_kWGW7axkfpgR`8* z!?UTG*Zi_baA1^0eda8S|@&F z{)Rad0kiLjB|=}XFJhD(S3ssKlveFFmkN{Vl^_nb!o5M!RC=m)V&v2%e?ZoRC@h3> zJ(?pvToFd`*Zc@HFPL#=otWKwtuuQ_dT-Hr{S%pQX<6dqVJ8;f(o)4~VM_kEQkMR+ zs1SCVi~k>M`u1u2xc}>#D!V&6nOOh-E$O&SzYrjJdZpaDv1!R-QGA141WjQe2s0J~ zQ;AXG)F+K#K8_5HVqRoRM%^EduqOnS(j2)|ctA6Q^=|s_WJYU;Z%5bHp08HPL`YF2 zR)Ad1z{zh`=sDs^&V}J z%$Z$!jd7BY5AkT?j`eqMs%!Gm@T8)4w3GYEX~IwgE~`d|@T{WYHkudy(47brgHXx& zBL1yFG6!!!VOSmDxBpefy2{L_u5yTwja&HA!mYA#wg#bc-m%~8aRR|~AvMnind@zs zy>wkShe5&*un^zvSOdlVu%kHsEo>@puMQ`b1}(|)l~E{5)f7gC=E$fP(FC2=F<^|A zxeIm?{EE!3sO!Gr7e{w)Dx(uU#3WrFZ>ibmKSQ1tY?*-Nh1TDHLe+k*;{Rp!Bmd_m zb#^kh`Y*8l|9Cz2e{;RL%_lg{#^Ar+NH|3z*Zye>!alpt{z;4dFAw^^H!6ING*EFc z_yqhr8d!;%nHX9AKhFQZBGrSzfzYCi%C!(Q5*~hX>)0N`vbhZ@N|i;_972WSx*>LH z87?en(;2_`{_JHF`Sv6Wlps;dCcj+8IJ8ca6`DsOQCMb3n# z3)_w%FuJ3>fjeOOtWyq)ag|PmgQbC-s}KRHG~enBcIwqIiGW8R8jFeBNY9|YswRY5 zjGUxdGgUD26wOpwM#8a!Nuqg68*dG@VM~SbOroL_On0N6QdT9?)NeB3@0FCC?Z|E0 z6TPZj(AsPtwCw>*{eDEE}Gby>0q{*lI+g2e&(YQrsY&uGM{O~}(oM@YWmb*F zA0^rr5~UD^qmNljq$F#ARXRZ1igP`MQx4aS6*MS;Ot(1L5jF2NJ;de!NujUYg$dr# z=TEL_zTj2@>ZZN(NYCeVX2==~=aT)R30gETO{G&GM4XN<+!&W&(WcDP%oL8PyIVUC zs5AvMgh6qr-2?^unB@mXK*Dbil^y-GTC+>&N5HkzXtozVf93m~xOUHn8`HpX=$_v2 z61H;Z1qK9o;>->tb8y%#4H)765W4E>TQ1o0PFj)uTOPEvv&}%(_mG0ISmyhnQV33Z$#&yd{ zc{>8V8XK$3u8}04CmAQ#I@XvtmB*s4t8va?-IY4@CN>;)mLb_4!&P3XSw4pA_NzDb zORn!blT-aHk1%Jpi>T~oGLuh{DB)JIGZ9KOsciWs2N7mM1JWM+lna4vkDL?Q)z_Ct z`!mi0jtr+4*L&N7jk&LodVO#6?_qRGVaucqVB8*us6i3BTa^^EI0x%EREQSXV@f!lak6Wf1cNZ8>*artIJ(ADO*=<-an`3zB4d*oO*8D1K!f z*A@P1bZCNtU=p!742MrAj%&5v%Xp_dSX@4YCw%F|%Dk=u|1BOmo)HsVz)nD5USa zR~??e61sO(;PR)iaxK{M%QM_rIua9C^4ppVS$qCT9j2%?*em?`4Z;4@>I(c%M&#cH z>4}*;ej<4cKkbCAjjDsyKS8rIm90O)Jjgyxj5^venBx&7B!xLmzxW3jhj7sR(^3Fz z84EY|p1NauwXUr;FfZjdaAfh%ivyp+^!jBjJuAaKa!yCq=?T_)R!>16?{~p)FQ3LDoMyG%hL#pR!f@P%*;#90rs_y z@9}@r1BmM-SJ#DeuqCQk=J?ixDSwL*wh|G#us;dd{H}3*-Y7Tv5m=bQJMcH+_S`zVtf;!0kt*(zwJ zs+kedTm!A}cMiM!qv(c$o5K%}Yd0|nOd0iLjus&;s0Acvoi-PFrWm?+q9f^FslxGi z6ywB`QpL$rJzWDg(4)C4+!2cLE}UPCTBLa*_=c#*$b2PWrRN46$y~yST3a2$7hEH= zNjux+wna^AzQ=KEa_5#9Ph=G1{S0#hh1L3hQ`@HrVnCx{!fw_a0N5xV(iPdKZ-HOM za)LdgK}1ww*C_>V7hbQnTzjURJL`S%`6nTHcgS+dB6b_;PY1FsrdE8(2K6FN>37!62j_cBlui{jO^$dPkGHV>pXvW0EiOA zqW`YaSUBWg_v^Y5tPJfWLcLpsA8T zG)!x>pKMpt!lv3&KV!-um= zKCir6`bEL_LCFx4Z5bAFXW$g3Cq`?Q%)3q0r852XI*Der*JNuKUZ`C{cCuu8R8nkt z%pnF>R$uY8L+D!V{s^9>IC+bmt<05h**>49R*#vpM*4i0qRB2uPbg8{{s#9yC;Z18 zD7|4m<9qneQ84uX|J&f-g8a|nFKFt34@Bt{CU`v(SYbbn95Q67*)_Esl_;v291s=9 z+#2F2apZU4Tq=x+?V}CjwD(P=U~d<=mfEFuyPB`Ey82V9G#Sk8H_Ob_RnP3s?)S_3 
zr%}Pb?;lt_)Nf>@zX~D~TBr;-LS<1I##8z`;0ZCvI_QbXNh8Iv)$LS=*gHr;}dgb=w5$3k2la1keIm|=7<-JD>)U%=Avl0Vj@+&vxn zt-)`vJxJr88D&!}2^{GPXc^nmRf#}nb$4MMkBA21GzB`-Or`-3lq^O^svO7Vs~FdM zv`NvzyG+0T!P8l_&8gH|pzE{N(gv_tgDU7SWeiI-iHC#0Ai%Ixn4&nt{5y3(GQs)i z&uA;~_0shP$0Wh0VooIeyC|lak__#KVJfxa7*mYmZ22@(<^W}FdKjd*U1CqSjNKW% z*z$5$=t^+;Ui=MoDW~A7;)Mj%ibX1_p4gu>RC}Z_pl`U*{_z@+HN?AF{_W z?M_X@o%w8fgFIJ$fIzBeK=v#*`mtY$HC3tqw7q^GCT!P$I%=2N4FY7j9nG8aIm$c9 zeKTxVKN!UJ{#W)zxW|Q^K!3s;(*7Gbn;e@pQBCDS(I|Y0euK#dSQ_W^)sv5pa%<^o zyu}3d?Lx`)3-n5Sy9r#`I{+t6x%I%G(iewGbvor&I^{lhu-!#}*Q3^itvY(^UWXgvthH52zLy&T+B)Pw;5>4D6>74 zO_EBS)>l!zLTVkX@NDqyN2cXTwsUVao7$HcqV2%t$YzdAC&T)dwzExa3*kt9d(}al zA~M}=%2NVNUjZiO7c>04YH)sRelXJYpWSn^aC$|Ji|E13a^-v2MB!Nc*b+=KY7MCm zqIteKfNkONq}uM;PB?vvgQvfKLPMB8u5+Am=d#>g+o&Ysb>dX9EC8q?D$pJH!MTAqa=DS5$cb+;hEvjwVfF{4;M{5U&^_+r zvZdu_rildI!*|*A$TzJ&apQWV@p{!W`=?t(o0{?9y&vM)V)ycGSlI3`;ps(vf2PUq zX745#`cmT*ra7XECC0gKkpu2eyhFEUb?;4@X7weEnLjXj_F~?OzL1U1L0|s6M+kIhmi%`n5vvDALMagi4`wMc=JV{XiO+^ z?s9i7;GgrRW{Mx)d7rj)?(;|b-`iBNPqdwtt%32se@?w4<^KU&585_kZ=`Wy^oLu9 z?DQAh5z%q;UkP48jgMFHTf#mj?#z|=w= z(q6~17Vn}P)J3M?O)x))%a5+>TFW3No~TgP;f}K$#icBh;rSS+R|}l鯊%1Et zwk~hMkhq;MOw^Q5`7oC{CUUyTw9x>^%*FHx^qJw(LB+E0WBX@{Ghw;)6aA-KyYg8p z7XDveQOpEr;B4je@2~usI5BlFadedX^ma{b{ypd|RNYqo#~d*mj&y`^iojR}s%~vF z(H!u`yx68D1Tj(3(m;Q+Ma}s2n#;O~bcB1`lYk%Irx60&-nWIUBr2x&@}@76+*zJ5 ze&4?q8?m%L9c6h=J$WBzbiTf1Z-0Eb5$IZs>lvm$>1n_Mezp*qw_pr8<8$6f)5f<@ zyV#tzMCs51nTv_5ca`x`yfE5YA^*%O_H?;tWYdM_kHPubA%vy47i=9>Bq) zRQ&0UwLQHeswmB1yP)+BiR;S+Vc-5TX84KUA;8VY9}yEj0eESSO`7HQ4lO z4(CyA8y1G7_C;6kd4U3K-aNOK!sHE}KL_-^EDl(vB42P$2Km7$WGqNy=%fqB+ zSLdrlcbEH=T@W8V4(TgoXZ*G1_aq$K^@ek=TVhoKRjw;HyI&coln|uRr5mMOy2GXP zwr*F^Y|!Sjr2YQXX(Fp^*`Wk905K%$bd03R4(igl0&7IIm*#f`A!DCarW9$h$z`kYk9MjjqN&5-DsH@8xh63!fTNPxWsFQhNv z#|3RjnP$Thdb#Ys7M+v|>AHm0BVTw)EH}>x@_f4zca&3tXJhTZ8pO}aN?(dHo)44Z z_5j+YP=jMlFqwvf3lq!57-SAuRV2_gJ*wsR_!Y4Z(trO}0wmB9%f#jNDHPdQGHFR; zZXzS-$`;7DQ5vF~oSgP3bNV$6Z(rwo6W(U07b1n3UHqml>{=6&-4PALATsH@Bh^W? 
z)ob%oAPaiw{?9HfMzpGb)@Kys^J$CN{uf*HX?)z=g`J(uK1YO^8~s1(ZIbG%Et(|q z$D@_QqltVZu9Py4R0Ld8!U|#`5~^M=b>fnHthzKBRr=i+w@0Vr^l|W;=zFT#PJ?*a zbC}G#It}rQP^Ait^W&aa6B;+0gNvz4cWUMzpv(1gvfw-X4xJ2Sv;mt;zb2Tsn|kSS zo*U9N?I{=-;a-OybL4r;PolCfiaL=y@o9{%`>+&FI#D^uy#>)R@b^1ue&AKKwuI*` zx%+6r48EIX6nF4o;>)zhV_8(IEX})NGU6Vs(yslrx{5fII}o3SMHW7wGtK9oIO4OM&@@ECtXSICLcPXoS|{;=_yj>hh*%hP27yZwOmj4&Lh z*Nd@OMkd!aKReoqNOkp5cW*lC)&C$P?+H3*%8)6HcpBg&IhGP^77XPZpc%WKYLX$T zsSQ$|ntaVVOoRat$6lvZO(G-QM5s#N4j*|N_;8cc2v_k4n6zx9c1L4JL*83F-C1Cn zaJhd;>rHXB%%ZN=3_o3&Qd2YOxrK~&?1=UuN9QhL$~OY-Qyg&})#ez*8NpQW_*a&kD&ANjedxT0Ar z<6r{eaVz3`d~+N~vkMaV8{F?RBVemN(jD@S8qO~L{rUw#=2a$V(7rLE+kGUZ<%pdr z?$DP|Vg#gZ9S}w((O2NbxzQ^zTot=89!0^~hE{|c9q1hVzv0?YC5s42Yx($;hAp*E zyoGuRyphQY{Q2ee0Xx`1&lv(l-SeC$NEyS~8iil3_aNlnqF_G|;zt#F%1;J)jnPT& z@iU0S;wHJ2$f!juqEzPZeZkjcQ+Pa@eERSLKsWf=`{R@yv7AuRh&ALRTAy z8=g&nxsSJCe!QLchJ=}6|LshnXIK)SNd zRkJNiqHwKK{SO;N5m5wdL&qK`v|d?5<4!(FAsDxR>Ky#0#t$8XCMptvNo?|SY?d8b z`*8dVBlXTUanlh6n)!EHf2&PDG8sXNAt6~u-_1EjPI1|<=33T8 zEnA00E!`4Ave0d&VVh0e>)Dc}=FfAFxpsC1u9ATfQ`-Cu;mhc8Z>2;uyXtqpLb7(P zd2F9<3cXS} znMg?{&8_YFTGRQZEPU-XPq55%51}RJpw@LO_|)CFAt62-_!u_Uq$csc+7|3+TV_!h z+2a7Yh^5AA{q^m|=KSJL+w-EWDBc&I_I1vOr^}P8i?cKMhGy$CP0XKrQzCheG$}G# zuglf8*PAFO8%xop7KSwI8||liTaQ9NCAFarr~psQt)g*pC@9bORZ>m`_GA`_K@~&% zijH0z;T$fd;-Liw8%EKZas>BH8nYTqsK7F;>>@YsE=Rqo?_8}UO-S#|6~CAW0Oz1} z3F(1=+#wrBJh4H)9jTQ_$~@#9|Bc1Pd3rAIA_&vOpvvbgDJOM(yNPhJJq2%PCcMaI zrbe~toYzvkZYQ{ea(Wiyu#4WB#RRN%bMe=SOk!CbJZv^m?Flo5p{W8|0i3`hI3Np# zvCZqY%o258CI=SGb+A3yJe~JH^i{uU`#U#fvSC~rWTq+K`E%J@ zasU07&pB6A4w3b?d?q}2=0rA#SA7D`X+zg@&zm^iA*HVi z009#PUH<%lk4z~p^l0S{lCJk1Uxi=F4e_DwlfHA`X`rv(|JqWKAA5nH+u4Da+E_p+ zVmH@lg^n4ixs~*@gm_dgQ&eDmE1mnw5wBz9Yg?QdZwF|an67Xd*x!He)Gc8&2!urh z4_uXzbYz-aX)X1>&iUjGp;P1u8&7TID0bTH-jCL&Xk8b&;;6p2op_=y^m@Nq*0{#o!!A;wNAFG@0%Z9rHo zcJs?Th>Ny6+hI`+1XoU*ED$Yf@9f91m9Y=#N(HJP^Y@ZEYR6I?oM{>&Wq4|v0IB(p zqX#Z<_3X(&{H+{3Tr|sFy}~=bv+l=P;|sBz$wk-n^R`G3p0(p>p=5ahpaD7>r|>pm zv;V`_IR@tvZreIuv2EM7ZQHhO+qUgw#kOs%*ekY^n|=1#x9&c;Ro&I~{rG-#_3ZB1 z?|9}IFdbP}^DneP*T-JaoYHt~r@EfvnPE5EKUwIxjPbsr$% zfWW83pgWST7*B(o=kmo)74$8UU)v0{@4DI+ci&%=#90}!CZz|rnH+Mz=HN~97G3~@ z;v5(9_2%eca(9iu@J@aqaMS6*$TMw!S>H(b z4(*B!|H|8&EuB%mITr~O?vVEf%(Gr)6E=>H~1VR z&1YOXluJSG1!?TnT)_*YmJ*o_Q@om~(GdrhI{$Fsx_zrkupc#y{DK1WOUR>tk>ZE) ziOLoBkhZZ?0Uf}cm>GsA>Rd6V8@JF)J*EQlQ<=JD@m<)hyElXR0`pTku*3MU`HJn| zIf7$)RlK^pW-$87U;431;Ye4Ie+l~_B3*bH1>*yKzn23cH0u(i5pXV! z4K?{3oF7ZavmmtTq((wtml)m6i)8X6ot_mrE-QJCW}Yn!(3~aUHYG=^fA<^~`e3yc z-NWTb{gR;DOUcK#zPbN^D*e=2eR^_!(!RKkiwMW@@yYtEoOp4XjOGgzi`;=8 zi3`Ccw1%L*y(FDj=C7Ro-V?q)-%p?Ob2ZElu`eZ99n14-ZkEV#y5C+{Pq87Gu3&>g zFy~Wk7^6v*)4pF3@F@rE__k3ikx(hzN3@e*^0=KNA6|jC^B5nf(XaoQaZN?Xi}Rn3 z$8&m*KmWvPaUQ(V<#J+S&zO|8P-#!f%7G+n_%sXp9=J%Z4&9OkWXeuZN}ssgQ#Tcj z8p6ErJQJWZ+fXLCco=RN8D{W%+*kko*2-LEb))xcHwNl~Xmir>kmAxW?eW50Osw3# zki8Fl$#fvw*7rqd?%E?}ZX4`c5-R&w!Y0#EBbelVXSng+kUfeUiqofPehl}$ormli zg%r)}?%=?_pHb9`Cq9Z|B`L8b>(!+8HSX?`5+5mm81AFXfnAt1*R3F z%b2RPIacKAddx%JfQ8l{3U|vK@W7KB$CdLqn@wP^?azRks@x8z59#$Q*7q!KilY-P zHUbs(IFYRGG1{~@RF;Lqyho$~7^hNC`NL3kn^Td%A7dRgr_&`2k=t+}D-o9&C!y^? 
z6MsQ=tc3g0xkK(O%DzR9nbNB(r@L;1zQrs8mzx&4dz}?3KNYozOW5;=w18U6$G4U2 z#2^qRLT*Mo4bV1Oeo1PKQ2WQS2Y-hv&S|C7`xh6=Pj7MNLC5K-zokZ67S)C;(F0Dd zloDK2_o1$Fmza>EMj3X9je7e%Q`$39Dk~GoOj89-6q9|_WJlSl!!+*{R=tGp z8u|MuSwm^t7K^nUe+^0G3dkGZr3@(X+TL5eah)K^Tn zXEtHmR9UIaEYgD5Nhh(s*fcG_lh-mfy5iUF3xxpRZ0q3nZ=1qAtUa?(LnT9I&~uxX z`pV?+=|-Gl(kz?w!zIieXT}o}7@`QO>;u$Z!QB${a08_bW0_o@&9cjJUXzVyNGCm8 zm=W+$H!;_Kzp6WQqxUI;JlPY&`V}9C$8HZ^m?NvI*JT@~BM=()T()Ii#+*$y@lTZBkmMMda>7s#O(1YZR+zTG@&}!EXFG{ zEWPSDI5bFi;NT>Yj*FjH((=oe%t%xYmE~AGaOc4#9K_XsVpl<4SP@E!TgC0qpe1oi zNpxU2b0(lEMcoibQ-G^cxO?ySVW26HoBNa;n0}CWL*{k)oBu1>F18X061$SP{Gu67 z-v-Fa=Fl^u3lnGY^o5v)Bux}bNZ~ z5pL+7F_Esoun8^5>z8NFoIdb$sNS&xT8_|`GTe8zSXQzs4r^g0kZjg(b0bJvz`g<70u9Z3fQILX1Lj@;@+##bP|FAOl)U^9U>0rx zGi)M1(Hce)LAvQO-pW!MN$;#ZMX?VE(22lTlJrk#pB0FJNqVwC+*%${Gt#r_tH9I_ z;+#)#8cWAl?d@R+O+}@1A^hAR1s3UcW{G+>;X4utD2d9X(jF555}!TVN-hByV6t+A zdFR^aE@GNNgSxxixS2p=on4(+*+f<8xrwAObC)D5)4!z7)}mTpb7&ofF3u&9&wPS< zB62WHLGMhmrmOAgmJ+|c>qEWTD#jd~lHNgT0?t-p{T=~#EMcB| z=AoDKOL+qXCfk~F)-Rv**V}}gWFl>liXOl7Uec_8v)(S#av99PX1sQIVZ9eNLkhq$ zt|qu0b?GW_uo}TbU8!jYn8iJeIP)r@;!Ze_7mj{AUV$GEz6bDSDO=D!&C9!M@*S2! zfGyA|EPlXGMjkH6x7OMF?gKL7{GvGfED=Jte^p=91FpCu)#{whAMw`vSLa`K#atdN zThnL+7!ZNmP{rc=Z>%$meH;Qi1=m1E3Lq2D_O1-X5C;!I0L>zur@tPAC9*7Jeh)`;eec}1`nkRP(%iv-`N zZ@ip-g|7l6Hz%j%gcAM}6-nrC8oA$BkOTz^?dakvX?`^=ZkYh%vUE z9+&)K1UTK=ahYiaNn&G5nHUY5niLGus@p5E2@RwZufRvF{@$hW{;{3QhjvEHMvduO z#Wf-@oYU4ht?#uP{N3utVzV49mEc9>*TV_W2TVC`6+oI)zAjy$KJrr=*q##&kobiQ z1vNbya&OVjK`2pdRrM?LuK6BgrLN7H_3m z!qpNKg~87XgCwb#I=Q&0rI*l$wM!qTkXrx1ko5q-f;=R2fImRMwt5Qs{P*p^z@9ex z`2#v(qE&F%MXlHpdO#QEZyZftn4f05ab^f2vjxuFaat2}jke{j?5GrF=WYBR?gS(^ z9SBiNi}anzBDBRc+QqizTTQuJrzm^bNA~A{j%ugXP7McZqJ}65l10({wk++$=e8O{ zxWjG!Qp#5OmI#XRQQM?n6?1ztl6^D40hDJr?4$Wc&O_{*OfMfxe)V0=e{|N?J#fgE>j9jAajze$iN!*yeF%jJU#G1c@@rm zolGW!j?W6Q8pP=lkctNFdfgUMg92wlM4E$aks1??M$~WQfzzzXtS)wKrr2sJeCN4X zY(X^H_c^PzfcO8Bq(Q*p4c_v@F$Y8cHLrH$`pJ2}=#*8%JYdqsqnGqEdBQMpl!Ot04tUGSXTQdsX&GDtjbWD=prcCT9(+ z&UM%lW%Q3yrl1yiYs;LxzIy>2G}EPY6|sBhL&X&RAQrSAV4Tlh2nITR?{6xO9ujGu zr*)^E`>o!c=gT*_@6S&>0POxcXYNQd&HMw6<|#{eSute2C3{&h?Ah|cw56-AP^f8l zT^kvZY$YiH8j)sk7_=;gx)vx-PW`hbSBXJGCTkpt;ap(}G2GY=2bbjABU5)ty%G#x zAi07{Bjhv}>OD#5zh#$0w;-vvC@^}F! 
z#X$@)zIs1L^E;2xDAwEjaXhTBw2<{&JkF*`;c3<1U@A4MaLPe{M5DGGkL}#{cHL%* zYMG+-Fm0#qzPL#V)TvQVI|?_M>=zVJr9>(6ib*#z8q@mYKXDP`k&A4A};xMK0h=yrMp~JW{L?mE~ph&1Y1a#4%SO)@{ zK2juwynUOC)U*hVlJU17%llUxAJFuKZh3K0gU`aP)pc~bE~mM!i1mi!~LTf>1Wp< zuG+ahp^gH8g8-M$u{HUWh0m^9Rg@cQ{&DAO{PTMudV6c?ka7+AO& z746QylZ&Oj`1aqfu?l&zGtJnpEQOt;OAFq19MXTcI~`ZcoZmyMrIKDFRIDi`FH)w; z8+*8tdevMDv*VtQi|e}CnB_JWs>fhLOH-+Os2Lh!&)Oh2utl{*AwR)QVLS49iTp{6 z;|172Jl!Ml17unF+pd+Ff@jIE-{Oxv)5|pOm@CkHW?{l}b@1>Pe!l}VccX#xp@xgJ zyE<&ep$=*vT=}7vtvif0B?9xw_3Gej7mN*dOHdQPtW5kA5_zGD zpA4tV2*0E^OUimSsV#?Tg#oiQ>%4D@1F5@AHwT8Kgen$bSMHD3sXCkq8^(uo7CWk`mT zuslYq`6Yz;L%wJh$3l1%SZv#QnG3=NZ=BK4yzk#HAPbqXa92;3K5?0kn4TQ`%E%X} z&>Lbt!!QclYKd6+J7Nl@xv!uD%)*bY-;p`y^ZCC<%LEHUi$l5biu!sT3TGGSTPA21 zT8@B&a0lJHVn1I$I3I1I{W9fJAYc+8 zVj8>HvD}&O`TqU2AAb={?eT;0hyL(R{|h23=4fDSZKC32;wWxsVj`P z3J3{M$PwdH!ro*Cn!D&=jnFR>BNGR<<|I8CI@+@658Dy(lhqbhXfPTVecY@L8%`3Q z1Fux2w?2C3th60jI~%OC9BtpNF$QPqcG+Pz96qZJ71_`0o0w_q7|h&O>`6U+^BA&5 zXd5Zp1Xkw~>M%RixTm&OqpNl8Q+ue=92Op_>T~_9UON?ZM2c0aGm=^A4ejrXj3dV9 zhh_bCt-b9`uOX#cFLj!vhZ#lS8Tc47OH>*)y#{O9?AT~KR9LntM|#l#Dlm^8{nZdk zjMl#>ZM%#^nK2TPzLcKxqx24P7R1FPlBy7LSBrRvx>fE$9AJ;7{PQm~^LBX^k#6Zq zw*Z(zJC|`!6_)EFR}8|n8&&Rbj8y028~P~sFXBFRt+tmqH-S3<%N;C&WGH!f3{7cm zy_fCAb9@HqaXa1Y5vFbxWf%#zg6SI$C+Uz5=CTO}e|2fjWkZ;Dx|84Ow~bkI=LW+U zuq;KSv9VMboRvs9)}2PAO|b(JCEC_A0wq{uEj|3x@}*=bOd zwr{TgeCGG>HT<@Zeq8y}vTpwDg#UBvD)BEs@1KP$^3$sh&_joQPn{hjBXmLPJ{tC) z*HS`*2+VtJO{|e$mM^|qv1R*8i(m1`%)}g=SU#T#0KlTM2RSvYUc1fP+va|4;5}Bfz98UvDCpq7}+SMV&;nX zQw~N6qOX{P55{#LQkrZk(e5YGzr|(B;Q;ju;2a`q+S9bsEH@i1{_Y0;hWYn1-79jl z5c&bytD*k)GqrVcHn6t-7kinadiD>B{Tl`ZY@`g|b~pvHh5!gKP4({rp?D0aFd_cN zhHRo4dd5^S6ViN(>(28qZT6E>??aRhc($kP`>@<+lIKS5HdhjVU;>f7<4))E*5|g{ z&d1}D|vpuV^eRj5j|xx9nwaCxXFG?Qbjn~_WSy=N}P0W>MP zG-F%70lX5Xr$a)2i6?i|iMyM|;Jtf*hO?=Jxj12oz&>P=1#h~lf%#fc73M2_(SUM- zf&qnjS80|_Y0lDgl&I?*eMumUklLe_=Td!9G@eR*tcPOgIShJipp3{A10u(4eT~DY zHezEj8V+7m!knn7)W!-5QI3=IvC^as5+TW1@Ern@yX| z7Nn~xVx&fGSr+L%4iohtS3w^{-H1A_5=r&x8}R!YZvp<2T^YFvj8G_vm}5q;^UOJf ztl=X3iL;;^^a#`t{Ae-%5Oq{?M#s6Npj+L(n-*LMI-yMR{)qki!~{5z{&`-iL}lgW zxo+tnvICK=lImjV$Z|O_cYj_PlEYCzu-XBz&XC-JVxUh9;6*z4fuBG+H{voCC;`~GYV|hj%j_&I zDZCj>Q_0RCwFauYoVMiUSB+*Mx`tg)bWmM^SwMA+?lBg12QUF_x2b)b?qb88K-YUd z0dO}3k#QirBV<5%jL$#wlf!60dizu;tsp(7XLdI=eQs?P`tOZYMjVq&jE)qK*6B^$ zBe>VvH5TO>s>izhwJJ$<`a8fakTL!yM^Zfr2hV9`f}}VVUXK39p@G|xYRz{fTI+Yq z20d=)iwjuG9RB$%$^&8#(c0_j0t_C~^|n+c`Apu|x7~;#cS-s=X1|C*YxX3ailhg_|0`g!E&GZJEr?bh#Tpb8siR=JxWKc{#w7g zWznLwi;zLFmM1g8V5-P#RsM@iX>TK$xsWuujcsVR^7TQ@!+vCD<>Bk9tdCo7Mzgq5 zv8d>dK9x8C@Qoh01u@3h0X_`SZluTb@5o;{4{{eF!-4405x8X7hewZWpz z2qEi4UTiXTvsa(0X7kQH{3VMF>W|6;6iTrrYD2fMggFA&-CBEfSqPlQDxqsa>{e2M z(R5PJ7uOooFc|9GU0ELA%m4&4Ja#cQpNw8i8ACAoK6?-px+oBl_yKmenZut#Xumjz zk8p^OV2KY&?5MUwGrBOo?ki`Sxo#?-Q4gw*Sh0k`@ zFTaYK2;}%Zk-68`#5DXU$2#=%YL#S&MTN8bF+!J2VT6x^XBci6O)Q#JfW{YMz) zOBM>t2rSj)n#0a3cjvu}r|k3od6W(SN}V-cL?bi*Iz-8uOcCcsX0L>ZXjLqk zZu2uHq5B|Kt>e+=pPKu=1P@1r9WLgYFq_TNV1p9pu0erHGd!+bBp!qGi+~4A(RsYN@CyXNrC&hxGmW)u5m35OmWwX`I+0yByglO`}HC4nGE^_HUs^&A(uaM zKPj^=qI{&ayOq#z=p&pnx@@k&I1JI>cttJcu@Ihljt?6p^6{|ds`0MoQwp+I{3l6` zB<9S((RpLG^>=Kic`1LnhpW2=Gu!x`m~=y;A`Qk!-w`IN;S8S930#vBVMv2vCKi}u z6<-VPrU0AnE&vzwV(CFC0gnZYcpa-l5T0ZS$P6(?9AM;`Aj~XDvt;Jua=jIgF=Fm? 
zdp=M$>`phx%+Gu};;-&7T|B1AcC#L4@mW5SV_^1BRbo6;2PWe$r+npRV`yc;T1mo& z+~_?7rA+(Um&o@Tddl zL_hxvWk~a)yY}%j`Y+200D%9$bWHy&;(yj{jpi?Rtz{J66ANw)UyPOm;t6FzY3$hx zcn)Ir79nhFvNa7^a{SHN7XH*|Vlsx`CddPnA&Qvh8aNhEA;mPVv;Ah=k<*u!Zq^7 z<=xs*iQTQOMMcg|(NA_auh@x`3#_LFt=)}%SQppP{E>mu_LgquAWvh<>L7tf9+~rO znwUDS52u)OtY<~!d$;m9+87aO+&`#2ICl@Y>&F{jI=H(K+@3M1$rr=*H^dye#~TyD z!){#Pyfn+|ugUu}G;a~!&&0aqQ59U@UT3|_JuBlYUpT$2+11;}JBJ`{+lQN9T@QFY z5+`t;6(TS0F?OlBTE!@7D`8#URDNqx2t6`GZ{ZgXeS@v%-eJzZOHz18aS|svxII$a zZeFjrJ*$IwX$f-Rzr_G>xbu@euGl)B7pC&S+CmDJBg$BoV~jxSO#>y z33`bupN#LDoW0feZe0%q8un0rYN|eRAnwDHQ6e_)xBTbtoZtTA=Fvk){q}9Os~6mQ zKB80VI_&6iSq`LnK7*kfHZoeX6?WE}8yjuDn=2#JG$+;-TOA1%^=DnXx%w{b=w}tS zQbU3XxtOI8E(!%`64r2`zog;5<0b4i)xBmGP^jiDZ2%HNSxIf3@wKs~uk4%3Mxz;~ zts_S~E4>W+YwI<-*-$U8*^HKDEa8oLbmqGg?3vewnaNg%Mm)W=)lcC_J+1ov^u*N3 zXJ?!BrH-+wGYziJq2Y#vyry6Z>NPgkEk+Ke`^DvNRdb>Q2Nlr#v%O@<5hbflI6EKE z9dWc0-ORk^T}jP!nkJ1imyjdVX@GrjOs%cpgA8-c&FH&$(4od#x6Y&=LiJZPINVyW z0snY$8JW@>tc2}DlrD3StQmA0Twck~@>8dSix9CyQOALcREdxoM$Sw*l!}bXKq9&r zysMWR@%OY24@e`?+#xV2bk{T^C_xSo8v2ZI=lBI*l{RciPwuE>L5@uhz@{!l)rtVlWC>)6(G)1~n=Q|S!{E9~6*fdpa*n z!()-8EpTdj=zr_Lswi;#{TxbtH$8*G=UM`I+icz7sr_SdnHXrv=?iEOF1UL+*6O;% zPw>t^kbW9X@oEXx<97%lBm-9?O_7L!DeD)Me#rwE54t~UBu9VZ zl_I1tBB~>jm@bw0Aljz8! zXBB6ATG6iByKIxs!qr%pz%wgqbg(l{65DP4#v(vqhhL{0b#0C8mq`bnqZ1OwFV z7mlZZJFMACm>h9v^2J9+^_zc1=JjL#qM5ZHaThH&n zXPTsR8(+)cj&>Un{6v*z?@VTLr{TmZ@-fY%*o2G}*G}#!bmqpoo*Ay@U!JI^Q@7gj;Kg-HIrLj4}#ec4~D2~X6vo;ghep-@&yOivYP zC19L0D`jjKy1Yi-SGPAn94(768Tcf$urAf{)1)9W58P`6MA{YG%O?|07!g9(b`8PXG1B1Sh0?HQmeJtP0M$O$hI z{5G`&9XzYhh|y@qsF1GnHN|~^ru~HVf#)lOTSrv=S@DyR$UKQk zjdEPFDz{uHM&UM;=mG!xKvp;xAGHOBo~>_=WFTmh$chpC7c`~7?36h)7$fF~Ii}8q zF|YXxH-Z?d+Q+27Rs3X9S&K3N+)OBxMHn1u(vlrUC6ckBY@@jl+mgr#KQUKo#VeFm zFwNYgv0<%~Wn}KeLeD9e1$S>jhOq&(e*I@L<=I5b(?G(zpqI*WBqf|Zge0&aoDUsC zngMRA_Kt0>La+Erl=Uv_J^p(z=!?XHpenzn$%EA`JIq#yYF?JLDMYiPfM(&Csr#f{ zdd+LJL1by?xz|D8+(fgzRs~(N1k9DSyK@LJygwaYX8dZl0W!I&c^K?7)z{2is;OkE zd$VK-(uH#AUaZrp=1z;O*n=b?QJkxu`Xsw&7yrX0?(CX=I-C#T;yi8a<{E~?vr3W> zQrpPqOW2M+AnZ&p{hqmHZU-;Q(7?- zP8L|Q0RM~sB0w1w53f&Kd*y}ofx@c z5Y6B8qGel+uT1JMot$nT1!Tim6{>oZzJXdyA+4euOLME?5Fd_85Uk%#E*ln%y{u8Q z$|?|R@Hpb~yTVK-Yr_S#%NUy7EBfYGAg>b({J|5b+j-PBpPy$Ns`PaJin4JdRfOaS zE|<HjH%NuJgsd2wOlv>~y=np%=2)$M9LS|>P)zJ+Fei5vYo_N~B0XCn+GM76 z)Xz3tg*FRVFgIl9zpESgdpWAavvVViGlU8|UFY{{gVJskg*I!ZjWyk~OW-Td4(mZ6 zB&SQreAAMqwp}rjy`HsG({l2&q5Y52<@AULVAu~rWI$UbFuZs>Sc*x+XI<+ez%$U)|a^unjpiW0l0 zj1!K0(b6$8LOjzRqQ~K&dfbMIE=TF}XFAi)$+h}5SD3lo z%%Qd>p9se=VtQG{kQ;N`sI)G^u|DN#7{aoEd zkksYP%_X$Rq08);-s6o>CGJ<}v`qs%eYf+J%DQ^2k68C%nvikRsN?$ap--f+vCS`K z#&~)f7!N^;sdUXu54gl3L=LN>FB^tuK=y2e#|hWiWUls__n@L|>xH{%8lIJTd5`w? 
zSwZbnS;W~DawT4OwSJVdAylbY+u5S+ZH{4hAi2&}Iv~W(UvHg(1GTZRPz`@{SOqzy z(8g&Dz=$PfRV=6FgxN~zo+G8OoPI&d-thcGVR*_^(R8COTM@bq?fDwY{}WhsQS1AK zF6R1t8!RdFmfocpJ6?9Yv~;WYi~XPgs(|>{5})j!AR!voO7y9&cMPo#80A(`za@t>cx<0;qxM@S*m(jYP)dMXr*?q0E`oL;12}VAep179uEr8c<=D zr5?A*C{eJ`z9Ee;E$8)MECqatHkbHH z&Y+ho0B$31MIB-xm&;xyaFCtg<{m~M-QDbY)fQ>Q*Xibb~8ytxZQ?QMf9!%cV zU0_X1@b4d+Pg#R!`OJ~DOrQz3@cpiGy~XSKjZQQ|^4J1puvwKeScrH8o{bscBsowomu z^f12kTvje`yEI3eEXDHJ6L+O{Jv$HVj%IKb|J{IvD*l6IG8WUgDJ*UGz z3!C%>?=dlfSJ>4U88)V+`U-!9r^@AxJBx8R;)J4Fn@`~k>8>v0M9xp90OJElWP&R5 zM#v*vtT}*Gm1^)Bv!s72T3PB0yVIjJW)H7a)ilkAvoaH?)jjb`MP>2z{%Y?}83 zUIwBKn`-MSg)=?R)1Q0z3b>dHE^)D8LFs}6ASG1|daDly_^lOSy&zIIhm*HXm1?VS=_iacG);_I9c zUQH1>i#*?oPIwBMJkzi_*>HoUe}_4o>2(SHWzqQ=;TyhAHS;Enr7!#8;sdlty&(>d zl%5cjri8`2X^Ds`jnw7>A`X|bl=U8n+3LKLy(1dAu8`g@9=5iw$R0qk)w8Vh_Dt^U zIglK}sn^)W7aB(Q>HvrX=rxB z+*L)3DiqpQ_%~|m=44LcD4-bxO3OO*LPjsh%p(k?&jvLp0py57oMH|*IMa(<|{m1(0S|x)?R-mqJ=I;_YUZA>J z62v*eSK;5w!h8J+6Z2~oyGdZ68waWfy09?4fU&m7%u~zi?YPHPgK6LDwphgaYu%0j zurtw)AYOpYKgHBrkX189mlJ`q)w-f|6>IER{5Lk97%P~a-JyCRFjejW@L>n4vt6#hq;!|m;hNE||LK3nw1{bJOy+eBJjK=QqNjI;Q6;Rp5 z&035pZDUZ#%Oa;&_7x0T<7!RW`#YBOj}F380Bq?MjjEhrvlCATPdkCTTl+2efTX$k zH&0zR1n^`C3ef~^sXzJK-)52(T}uTG%OF8yDhT76L~|^+hZ2hiSM*QA9*D5odI1>& z9kV9jC~twA5MwyOx(lsGD_ggYmztXPD`2=_V|ks_FOx!_J8!zM zTzh^cc+=VNZ&(OdN=y4Juw)@8-85lwf_#VMN!Ed(eQiRiLB2^2e`4dp286h@v@`O%_b)Y~A; zv}r6U?zs&@uD_+(_4bwoy7*uozNvp?bXFoB8?l8yG0qsm1JYzIvB_OH4_2G*IIOwT zVl%HX1562vLVcxM_RG*~w_`FbIc!(T=3>r528#%mwwMK}uEhJ()3MEby zQQjzqjWkwfI~;Fuj(Lj=Ug0y`>~C7`w&wzjK(rPw+Hpd~EvQ-ufQOiB4OMpyUKJhw zqEt~jle9d7S~LI~$6Z->J~QJ{Vdn3!c}g9}*KG^Kzr^(7VI5Gk(mHLL{itj_hG?&K4Ws0+T4gLfi3eu$N=`s36geNC?c zm!~}vG6lx9Uf^5M;bWntF<-{p^bruy~f?sk9 zcETAPQZLoJ8JzMMg<-=ju4keY@SY%Wo?u9Gx=j&dfa6LIAB|IrbORLV1-H==Z1zCM zeZcOYpm5>U2fU7V*h;%n`8 zN95QhfD994={1*<2vKLCNF)feKOGk`R#K~G=;rfq}|)s20&MCa65 zUM?xF5!&e0lF%|U!#rD@I{~OsS_?=;s_MQ_b_s=PuWdC)q|UQ&ea)DMRh5>fpQjXe z%9#*x=7{iRCtBKT#H>#v%>77|{4_slZ)XCY{s3j_r{tdpvb#|r|sbS^dU1x70$eJMU!h{Y7Kd{dl}9&vxQl6Jt1a` zHQZrWyY0?!vqf@u-fxU_@+}u(%Wm>0I#KP48tiAPYY!TdW(o|KtVI|EUB9V`CBBNaBLVih7+yMVF|GSoIQD0Jfb{ z!OXq;(>Z?O`1gap(L~bUcp>Lc@Jl-})^=6P%<~~9ywY=$iu8pJ0m*hOPzr~q`23eX zgbs;VOxxENe0UMVeN*>uCn9Gk!4siN-e>x)pIKAbQz!G)TcqIJ0`JBBaX>1-4_XO_-HCS^vr2vjv#7KltDZdyQ{tlWh4$Gm zB>|O1cBDC)yG(sbnc*@w6e%e}r*|IhpXckx&;sQCwGdKH+3oSG-2)Bf#x`@<4ETAr z0My%7RFh6ZLiZ_;X6Mu1YmXx7C$lSZ^}1h;j`EZd6@%JNUe=btBE z%s=Xmo1Ps?8G`}9+6>iaB8bgjUdXT?=trMu|4yLX^m0Dg{m7rpKNJey|EwHI+nN1e zL^>qN%5Fg)dGs4DO~uwIdXImN)QJ*Jhpj7$fq_^`{3fwpztL@WBB}OwQ#Epo-mqMO zsM$UgpFiG&d#)lzEQ{3Q;)&zTw;SzGOah-Dpm{!q7<8*)Ti_;xvV2TYXa}=faXZy? 
z3y?~GY@kl)>G&EvEijk9y1S`*=zBJSB1iet>0;x1Ai)*`^{pj0JMs)KAM=@UyOGtO z3y0BouW$N&TnwU6!%zS%nIrnANvZF&vB1~P5_d`x-giHuG zPJ;>XkVoghm#kZXRf>qxxEix;2;D1CC~NrbO6NBX!`&_$iXwP~P*c($EVV|669kDO zKoTLZNF4Cskh!Jz5ga9uZ`3o%7Pv`d^;a=cXI|>y;zC3rYPFLQkF*nv(r>SQvD*## z(Vo%^9g`%XwS0t#94zPq;mYGLKu4LU3;txF26?V~A0xZbU4Lmy`)>SoQX^m7fd^*E z+%{R4eN!rIk~K)M&UEzxp9dbY;_I^c} zOc{wlIrN_P(PPqi51k_$>Lt|X6A^|CGYgKAmoI#Li?;Wq%q~q*L7ehZkUrMxW67Jl zhsb~+U?33QS>eqyN{(odAkbopo=Q$Az?L+NZW>j;#~@wCDX?=L5SI|OxI~7!Pli;e zELMFcZtJY3!|=Gr2L4>z8yQ-{To>(f80*#;6`4IAiqUw`=Pg$%C?#1 z_g@hIGerILSU>=P>z{gM|DS91A4cT@PEIB^hSop!uhMo#2G;+tQSpDO_6nOnPWSLU zS;a9m^DFMXR4?*X=}d7l;nXuHk&0|m`NQn%d?8|Ab3A9l9Jh5s120ibWBdB z$5YwsK3;wvp!Kn@)Qae{ef`0#NwlRpQ}k^r>yos_Ne1;xyKLO?4)t_G4eK~wkUS2A&@_;)K0-03XGBzU+5f+uMDxC z(s8!8!RvdC#@`~fx$r)TKdLD6fWEVdEYtV#{ncT-ZMX~eI#UeQ-+H(Z43vVn%Yj9X zLdu9>o%wnWdvzA-#d6Z~vzj-}V3FQ5;axDIZ;i(95IIU=GQ4WuU{tl-{gk!5{l4_d zvvb&uE{%!iFwpymz{wh?bKr1*qzeZb5f6e6m_ozRF&zux2mlK=v_(_s^R6b5lu?_W4W3#<$zeG~Pd)^!4tzhs}-Sx$FJP>)ZGF(hVTH|C3(U zs0PO&*h_ zNA-&qZpTP$$LtIgfiCn07}XDbK#HIXdmv8zdz4TY;ifNIH-0jy(gMSByG2EF~Th#eb_TueZC` zE?3I>UTMpKQ})=C;6p!?G)M6w^u*A57bD?2X`m3X^6;&4%i_m(uGJ3Z5h`nwxM<)H z$I5m?wN>O~8`BGnZ=y^p6;0+%_0K}Dcg|K;+fEi|qoBqvHj(M&aHGqNF48~XqhtU? z^ogwBzRlOfpAJ+Rw7IED8lRbTdBdyEK$gPUpUG}j-M42xDj_&qEAQEtbs>D#dRd7Y z<&TpSZ(quQDHiCFn&0xsrz~4`4tz!CdL8m~HxZM_agu@IrBpyeL1Ft}V$HX_ZqDPm z-f89)pjuEzGdq-PRu`b1m+qBGY{zr_>{6Ss>F|xHZlJj9dt5HD$u`1*WZe)qEIuDSR)%z+|n zatVlhQ?$w#XRS7xUrFE;Y8vMGhQS5*T{ZnY=q1P?w5g$OKJ#M&e??tAmPWHMj3xhS ziGxapy?kn@$~2%ZY;M8Bc@%$pkl%Rvj!?o%agBvpQ-Q61n9kznC4ttrRNQ4%GFR5u zyv%Yo9~yxQJWJSfj z?#HY$y=O~F|2pZs22pu|_&Ajd+D(Mt!nPUG{|1nlvP`=R#kKH zO*s$r_%ss5h1YO7k0bHJ2CXN)Yd6CHn~W!R=SqkWe=&nAZu(Q1G!xgcUilM@YVei@2@a`8he z9@pM`)VB*=e7-MWgLlXlc)t;fF&-AwM{E-EX}pViFn0I0CNw2bNEnN2dj!^4(^zS3 zobUm1uQnpqk_4q{pl*n06=TfK_C>UgurKFjRXsK_LEn};=79`TB12tv6KzwSu*-C8 z;=~ohDLZylHQ|Mpx-?yql>|e=vI1Z!epyUpAcDCp4T|*RV&X`Q$0ogNwy6mFALo^@ z9=&(9txO8V@E!@6^(W0{*~CT>+-MA~vnJULBxCTUW>X5>r7*eXYUT0B6+w@lzw%n> z_VjJ<2qf|(d6jYq2(x$(ZDf!yVkfnbvNmb5c|hhZ^2TV_LBz`9w!e_V*W_(MiA7|= z&EeIIkw*+$Xd!)j8<@_<}A5;~A_>3JT*kX^@}cDoLd>Qj<`Se^wdUa(j0dp+Tl8EptwBm{9OGsdFEq zM`!pjf(Lm(`$e3FLOjqA5LnN5o!}z{ zNf}rJuZh@yUtq&ErjHeGzX4(!luV!jB&;FAP|!R_QHYw#^Z1LwTePAKJ6X&IDNO#; z)#I@Xnnzyij~C@UH~X51JCgQeF0&hTXnuoElz#m{heZRexWc0k4<>0+ClX7%0 zEBqCCld1tD9Zwkr4{?Nor19#E5-YKfB8d?qgR82-Ow2^AuNevly2*tHA|sK!ybYkX zm-sLQH72P&{vEAW6+z~O5d0qd=xW~rua~5a?ymYFSD@8&gV)E5@RNNBAj^C99+Z5Z zR@Pq55mbCQbz+Mn$d_CMW<-+?TU960agEk1J<>d>0K=pF19yN))a~4>m^G&tc*xR+yMD*S=yip-q=H zIlredHpsJV8H(32@Zxc@bX6a21dUV95Th--8pE6C&3F>pk=yv$yd6@Haw;$v4+Fcb zRwn{Qo@0`7aPa2LQOP}j9v>sjOo5Kqvn|`FLizX zB+@-u4Lw|jsvz{p^>n8Vo8H2peIqJJnMN}A)q6%$Tmig7eu^}K2 zrh$X?T|ZMsoh{6pdw1G$_T<`Ds-G=jc;qcGdK4{?dN2-XxjDNbb(7pk|3JUVCU4y; z)?LXR>f+AAu)JEiti_Zy#z5{RgsC}R(@jl%9YZ>zu~hKQ*AxbvhC378-I@{~#%Y`Z zy=a=9YpewPIC+gkEUUwtUL7|RU7=!^Aa}Mk^6uxOgRGA#JXjWLsjFUnix|Mau{hDT z7mn*z1m5g`vP(#tjT0Zy4eAY(br&!RiiXE=ZI!{sE1#^#%x^Z7t1U)b<;%Y}Q9=5v z;wpDCEZ@OE36TWT=|gxigT@VaW9BvHS05;_P(#s z8zI4XFQys}q)<`tkX$WnSarn{3e!s}4(J!=Yf>+Y>cP3f;vr63f2{|S^`_pWc)^5_!R z*(x-fuBxL51@xe!lnDBKi}Br$c$BMZ3%f2Sa6kLabiBS{pq*yj;q|k(86x`PiC{p6 z_bxCW{>Q2BA8~Ggz&0jkrcU+-$ANBsOop*ms>34K9lNYil@}jC;?cYP(m^P}nR6FV zk(M%48Z&%2Rx$A&FhOEirEhY0(dn;-k(qkTU)sFQ`+-ih+s@A8g?r8Pw+}2;35WYf zi}VO`jS`p(tc)$X$a>-#WXoW!phhatC*$}|rk>|wUU71eUJG^$c6_jwX?iSHM@6__ zvV|6%U*$sSXJu9SX?2%M^kK|}a2QJ8AhF{fuXrHZxXsI~O zGKX45!K7p*MCPEQ=gp?eu&#AW*pR{lhQR##P_*{c_DjMGL|3T3-bSJ(o$|M{ytU}> zAV>wq*uE*qFo9KvnA^@juy{x<-u*#2NvkV={Ly}ysKYB-k`K3@K#^S1Bb$8Y#0L0# z`6IkSG&|Z$ODy|VLS+y5pFJx&8tvPmMd8c9FhCyiU8~k6FwkakUd^(_ml8`rnl>JS 
zZV){9G*)xBqPz^LDqRwyS6w86#D^~xP4($150M)SOZRe9sn=>V#aG0Iy(_^YcPpIz8QYM-#s+n% z@Jd?xQq?Xk6=<3xSY7XYP$$yd&Spu{A#uafiIfy8gRC`o0nk{ezEDjb=q_qRAlR1d zFq^*9Gn)yTG4b}R{!+3hWQ+u3GT~8nwl2S1lpw`s0X_qpxv)g+JIkVKl${sYf_nV~B>Em>M;RlqGb5WVil(89 zs=ld@|#;dq1*vQGz=7--Br-|l) zZ%Xh@v8>B7P?~}?Cg$q9_={59l%m~O&*a6TKsCMAzG&vD>k2WDzJ6!tc!V)+oxF;h zJH;apM=wO?r_+*#;ulohuP=E>^zon}a$NnlcQ{1$SO*i=jnGVcQa^>QOILc)e6;eNTI>os=eaJ{*^DE+~jc zS}TYeOykDmJ=6O%>m`i*>&pO_S;qMySJIyP=}4E&J%#1zju$RpVAkZbEl+p%?ZP^C z*$$2b4t%a(e+%>a>d_f_<JjxI#J1x;=hPd1zFPx=6T$;;X1TD*2(edZ3f46zaAoW>L53vS_J*N8TMB|n+;LD| zC=GkQPpyDY#Am4l49chDv*gojhRj_?63&&8#doW`INATAo(qY#{q}%nf@eTIXmtU< zdB<7YWfyCmBs|c)cK>1)v&M#!yNj#4d$~pVfDWQc_ke1?fw{T1Nce_b`v|Vp5ig(H zJvRD^+ps46^hLX;=e2!2e;w9y1D@!D$c@Jc&%%%IL=+xzw55&2?darw=9g~>P z9>?Kdc$r?6c$m%x2S$sdpPl>GQZ{rC9mPS63*qjCVa?OIBj!fW zm|g?>CVfGXNjOfcyqImXR_(tXS(F{FcoNzKvG5R$IgGaxC@)i(e+$ME}vPVIhd|mx2IIE+f zM?9opQHIVgBWu)^A|RzXw!^??S!x)SZOwZaJkGjc<_}2l^eSBm!eAJG9T>EC6I_sy z?bxzDIAn&K5*mX)$RQzDA?s)-no-XF(g*yl4%+GBf`##bDXJ==AQk*xmnatI;SsLp zP9XTHq5mmS=iWu~9ES>b%Q=1aMa|ya^vj$@qz9S!ih{T8_PD%Sf_QrNKwgrXw9ldm zHRVR98*{C?_XNpJn{abA!oix_mowRMu^2lV-LPi;0+?-F(>^5#OHX-fPED zCu^l7u3E%STI}c4{J2!)9SUlGP_@!d?5W^QJXOI-Ea`hFMKjR7TluLvzC-ozCPn1`Tpy z!vlv@_Z58ILX6>nDjTp-1LlFMx~-%GA`aJvG$?8*Ihn;mH37eK**rmOEwqegf-Ccx zrIX4;{c~RK>XuTXxYo5kMiWMy)!IC{*DHG@E$hx?RwP@+wuad(P1{@%tRkyJRqD)3 zMHHHZ4boqDn>-=DgR5VlhQTpfVy182Gk;A_S8A1-;U1RR>+$62>(MUx@Nox$vTjHq z%QR=j!6Gdyb5wu7y(YUktwMuW5<@jl?m4cv4BODiT5o8qVdC0MBqGr@-YBIwnpZAY znX9(_uQjP}JJ=!~Ve9#5I~rUnN|P_3D$LqZcvBnywYhjlMSFHm`;u9GPla{5QD7(7*6Tb3Svr8;(nuAd81q$*uq6HC_&~je*Ca7hP4sJp0av{M8480wF zxASi7Qv+~@2U%Nu1Ud;s-G4CTVWIPyx!sg&8ZG0Wq zG_}i3C(6_1>q3w!EH7$Kwq8uBp2F2N7}l65mk1p*9v0&+;th=_E-W)E;w}P(j⁢ zv5o9#E7!G0XmdzfsS{efPNi`1b44~SZ4Z8fuX!I}#8g+(wxzQwUT#Xb2(tbY1+EUhGKoT@KEU9Ktl>_0 z%bjDJg;#*gtJZv!-Zs`?^}v5eKmnbjqlvnSzE@_SP|LG_PJ6CYU+6zY6>92%E+ z=j@TZf-iW4(%U{lnYxQA;7Q!b;^brF8n0D>)`q5>|WDDXLrqYU_tKN2>=#@~OE7grMnNh?UOz-O~6 z6%rHy{#h9K0AT+lDC7q4{hw^|q6*Ry;;L%Q@)Ga}$60_q%D)rv(CtS$CQbpq9|y1e zRSrN4;$Jyl{m5bZw`$8TGvb}(LpY{-cQ)fcyJv7l3S52TLXVDsphtv&aPuDk1OzCA z4A^QtC(!11`IsNx_HnSy?>EKpHJWT^wmS~hc^p^zIIh@9f6U@I2 zC=Mve{j2^)mS#U$e{@Q?SO6%LDsXz@SY+=cK_QMmXBIU)j!$ajc-zLx3V60EXJ!qC zi<%2x8Q24YN+&8U@CIlN zrZkcT9yh%LrlGS9`G)KdP(@9Eo-AQz@8GEFWcb7U=a0H^ZVbLmz{+&M7W(nXJ4sN8 zJLR7eeK(K8`2-}j(T7JsO`L!+CvbueT%izanm-^A1Dn{`1Nw`9P?cq;7no+XfC`K(GO9?O^5zNIt4M+M8LM0=7Gz8UA@Z0N+lg+cX)NfazRu z5D)~HA^(u%w^cz+@2@_#S|u>GpB+j4KzQ^&Wcl9f z&hG#bCA(Yk0D&t&aJE^xME^&E-&xGHhXn%}psEIj641H+Nl-}boj;)Zt*t(4wZ5DN z@GXF$bL=&pBq-#vkTkh>7hl%K5|3 z{`Vn9b$iR-SoGENp}bn4;fR3>9sA%X2@1L3aE9yTra;Wb#_`xWwLSLdfu+PAu+o3| zGVnpzPr=ch{uuoHjtw7+_!L_2;knQ!DuDl0R`|%jr+}jFzXtrHIKc323?JO{l&;VF z*L1+}JU7%QJOg|5|Tc|D8fN zJORAg=_vsy{ak|o);@)Yh8Lkcg@$FG3k@ep36BRa^>~UmnRPziS>Z=`Jb2x*Q#`%A zU*i3&Vg?TluO@X0O;r2Jl6LKLUOVhSqg1*qOt^|8*c7 zo(298@+r$k_wQNGHv{|$tW(T8L+4_`FQ{kEW5Jgg{yf7ey4ss_(SNKfz(N9lx&a;< je(UuV8hP?p&}TPdm1I$XmG#(RzlD&B2izSj9sl%y5~4qc diff --git a/gradle/wrapper/gradle-wrapper.properties b/gradle/wrapper/gradle-wrapper.properties index 3499ded5c..1af9e0930 100644 --- a/gradle/wrapper/gradle-wrapper.properties +++ b/gradle/wrapper/gradle-wrapper.properties @@ -2,5 +2,6 @@ distributionBase=GRADLE_USER_HOME distributionPath=wrapper/dists distributionUrl=https\://services.gradle.org/distributions/gradle-8.5-bin.zip networkTimeout=10000 +validateDistributionUrl=true zipStoreBase=GRADLE_USER_HOME zipStorePath=wrapper/dists diff --git a/release-notes/opensearch-anomaly-detection.release-notes-2.0.0.0-rc1.md 
diff --git a/release-notes/opensearch-anomaly-detection.release-notes-2.0.0.0-rc1.md b/release-notes/opensearch-anomaly-detection.release-notes-2.0.0.0-rc1.md
new file mode 100644
index 000000000..1ca01d5e2
--- /dev/null
+++ b/release-notes/opensearch-anomaly-detection.release-notes-2.0.0.0-rc1.md
@@ -0,0 +1,35 @@
+## Version 2.0.0.0-rc1 Release Notes
+
+Compatible with OpenSearch 2.0.0-rc1
+
+### Enhancements
+
+* changed usages of "master" to "clusterManager" in variable names ([#504](https://github.com/opensearch-project/anomaly-detection/pull/504))
+
+### Bug Fixes
+
+* Changed default description to empty string instead of null ([#438](https://github.com/opensearch-project/anomaly-detection/pull/438))
+* Fixed ADTaskProfile toXContent bug and added to .gitignore ([#447](https://github.com/opensearch-project/anomaly-detection/pull/447))
+* Fix restart HCAD detector bug ([#460](https://github.com/opensearch-project/anomaly-detection/pull/460))
+* Check if indices exist in the presence of empty search results ([#495](https://github.com/opensearch-project/anomaly-detection/pull/495))
+
+### Infrastructure
+
+* Reduced jacoco exclusions and added more tests ([#446](https://github.com/opensearch-project/anomaly-detection/pull/446))
+* Remove oss flavor ([#449](https://github.com/opensearch-project/anomaly-detection/pull/449))
+* Add auto labeler workflow ([#455](https://github.com/opensearch-project/anomaly-detection/pull/455))
+* Gradle 7 and Opensearch 2.0 upgrade ([#464](https://github.com/opensearch-project/anomaly-detection/pull/464))
+* Add support for -Dbuild.version_qualifier ([#468](https://github.com/opensearch-project/anomaly-detection/pull/468))
+* Changed forbiddenAPIsTest files and made relevant forbidden fixes ([#450](https://github.com/opensearch-project/anomaly-detection/pull/450))
+* Adding test-retry plugin ([#456](https://github.com/opensearch-project/anomaly-detection/pull/456))
+* Updated issue templates from .github. ([#488](https://github.com/opensearch-project/anomaly-detection/pull/488))
+* removing job-scheduler zip and replacing with distribution build ([#487](https://github.com/opensearch-project/anomaly-detection/pull/487))
+* JDK 17 support ([#489](https://github.com/opensearch-project/anomaly-detection/pull/489))
+* Moving script file in scripts folder for file location standardization ([#494](https://github.com/opensearch-project/anomaly-detection/pull/494))
+* Removed rcf jar for 3.0-rc1 and fixed zip fetching for AD and JS ([#500](https://github.com/opensearch-project/anomaly-detection/pull/500))
+* changed to rc1 and add tar to distribution download link ([#503](https://github.com/opensearch-project/anomaly-detection/pull/503))
+* Remove BWC zips for dynamic dependency ([#505](https://github.com/opensearch-project/anomaly-detection/pull/505))
+
+### Documentation
+
+* Add Visualization integration RFC docs ([#477](https://github.com/opensearch-project/anomaly-detection/pull/477))
diff --git a/src/main/java/org/opensearch/ad/AnomalyDetectorJobRunner.java b/src/main/java/org/opensearch/ad/AnomalyDetectorJobRunner.java
index 4e62548e5..98135e1ee 100644
--- a/src/main/java/org/opensearch/ad/AnomalyDetectorJobRunner.java
+++ b/src/main/java/org/opensearch/ad/AnomalyDetectorJobRunner.java
@@ -13,9 +13,9 @@
 import static org.opensearch.action.DocWriteResponse.Result.CREATED;
 import static org.opensearch.action.DocWriteResponse.Result.UPDATED;
-import static org.opensearch.ad.AnomalyDetectorPlugin.AD_THREAD_POOL_NAME;
-import static org.opensearch.ad.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME;
+import static org.opensearch.timeseries.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
 
 import java.io.IOException;
 import java.time.Instant;
@@ -30,21 +30,14 @@
 import org.opensearch.action.get.GetResponse;
 import org.opensearch.action.index.IndexRequest;
 import org.opensearch.action.support.WriteRequest;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.common.exception.InternalFailure;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
-import org.opensearch.ad.model.ADTaskState;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
-import org.opensearch.ad.rest.handler.AnomalyDetectorFunction;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.ad.transport.AnomalyResultAction;
 import org.opensearch.ad.transport.AnomalyResultRequest;
 import org.opensearch.ad.transport.AnomalyResultResponse;
 import org.opensearch.ad.transport.AnomalyResultTransportAction;
-import org.opensearch.ad.util.SecurityUtil;
 import org.opensearch.client.Client;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.common.xcontent.LoggingDeprecationHandler;
@@ -62,6 +55,16 @@
 import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule;
 import org.opensearch.jobscheduler.spi.utils.LockService;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.common.exception.InternalFailure;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.function.ExecutorFunction;
+import org.opensearch.timeseries.model.Job;
+import org.opensearch.timeseries.model.TaskState;
+import org.opensearch.timeseries.util.SecurityUtil;
 
 import com.google.common.base.Throwables;
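The import hunks above carry the core of this refactor: shared plumbing moves from org.opensearch.ad.* into the org.opensearch.timeseries.* namespace, with AnomalyDetectorJob becoming Job, ADTaskState becoming TaskState, AnomalyDetectionException becoming TimeSeriesException, and AnomalyDetectorFunction becoming ExecutorFunction. A minimal sketch of the type guard that runJob installs a few hunks below, written against the relocated types; the wrapper class and helper name here are hypothetical, only the check itself is taken from this diff:

    import org.opensearch.jobscheduler.spi.ScheduledJobParameter;
    import org.opensearch.timeseries.model.Job; // was org.opensearch.ad.model.AnomalyDetectorJob

    final class JobParameterGuard {
        // Mirrors the runJob() check below: anything that is not the shared Job type is rejected.
        static Job asJob(ScheduledJobParameter parameter) {
            if (!(parameter instanceof Job)) {
                throw new IllegalArgumentException(
                    "Job parameter is not instance of Job, type: " + parameter.getClass().getCanonicalName()
                );
            }
            return (Job) parameter;
        }
    }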
@@ -76,7 +79,7 @@ public class AnomalyDetectorJobRunner implements ScheduledJobRunner {
     private Client client;
     private ThreadPool threadPool;
     private ConcurrentHashMap<String, Integer> detectorEndRunExceptionCount;
-    private AnomalyDetectionIndices anomalyDetectionIndices;
+    private ADIndexManagement anomalyDetectionIndices;
     private ADTaskManager adTaskManager;
     private NodeStateManager nodeStateManager;
     private ExecuteADResultResponseRecorder recorder;
@@ -109,14 +112,14 @@ public void setThreadPool(ThreadPool threadPool) {
 
     public void setSettings(Settings settings) {
         this.settings = settings;
-        this.maxRetryForEndRunException = AnomalyDetectorSettings.MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings);
+        this.maxRetryForEndRunException = AnomalyDetectorSettings.AD_MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings);
     }
 
     public void setAdTaskManager(ADTaskManager adTaskManager) {
         this.adTaskManager = adTaskManager;
     }
 
-    public void setAnomalyDetectionIndices(AnomalyDetectionIndices anomalyDetectionIndices) {
+    public void setAnomalyDetectionIndices(ADIndexManagement anomalyDetectionIndices) {
         this.anomalyDetectionIndices = anomalyDetectionIndices;
     }
 
@@ -133,12 +136,12 @@ public void runJob(ScheduledJobParameter scheduledJobParameter, JobExecutionCont
         String detectorId = scheduledJobParameter.getName();
         log.info("Start to run AD job {}", detectorId);
         adTaskManager.refreshRealtimeJobRunTime(detectorId);
-        if (!(scheduledJobParameter instanceof AnomalyDetectorJob)) {
+        if (!(scheduledJobParameter instanceof Job)) {
             throw new IllegalArgumentException(
-                "Job parameter is not instance of AnomalyDetectorJob, type: " + scheduledJobParameter.getClass().getCanonicalName()
+                "Job parameter is not instance of Job, type: " + scheduledJobParameter.getClass().getCanonicalName()
             );
         }
-        AnomalyDetectorJob jobParameter = (AnomalyDetectorJob) scheduledJobParameter;
+        Job jobParameter = (Job) scheduledJobParameter;
         Instant executionStartTime = Instant.now();
         IntervalSchedule schedule = (IntervalSchedule) jobParameter.getSchedule();
         Instant detectionStartTime = executionStartTime.minus(schedule.getInterval(), schedule.getUnit());
@@ -147,12 +150,12 @@ public void runJob(ScheduledJobParameter scheduledJobParameter, JobExecutionCont
 
         Runnable runnable = () -> {
             try {
-                nodeStateManager.getAnomalyDetector(detectorId, ActionListener.wrap(detectorOptional -> {
+                nodeStateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(detectorOptional -> {
                     if (!detectorOptional.isPresent()) {
                         log.error(new ParameterizedMessage("fail to get detector [{}]", detectorId));
                         return;
                     }
-                    AnomalyDetector detector = detectorOptional.get();
+                    AnomalyDetector detector = (AnomalyDetector) detectorOptional.get();
 
                     if (jobParameter.getLockDurationSeconds() != null) {
                         lockService
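The @@ -147,12 +150,12 hunk above changes how the runner resolves its configuration: NodeStateManager.getConfig is keyed by an AnalysisType, so one state manager can serve both anomaly detection and other time-series analyses, and the shared config object is narrowed back to AnomalyDetector at the call site. A sketch of the new call shape, wrapped in a hypothetical helper so the collaborators are explicit; it assumes, as the cast in the hunk implies, that the listener receives an Optional of a shared config supertype, and the failure arm is an assumption because the hunk ends before it:

    // Hypothetical helper mirroring the runner's lookup; log, ActionListener and
    // ParameterizedMessage are the same types the runner already uses above.
    void resolveDetector(NodeStateManager nodeStateManager, String detectorId) {
        nodeStateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(configOptional -> {
            if (!configOptional.isPresent()) {
                log.error(new ParameterizedMessage("fail to get detector [{}]", detectorId));
                return;
            }
            // getConfig hands back the shared config supertype; AD code narrows it again.
            AnomalyDetector detector = (AnomalyDetector) configOptional.get();
            // ... acquire the job lock and dispatch runAdJob, as in the hunk above
        }, exception -> log.error("fail to get detector [" + detectorId + "]", exception))); // failure arm assumed
    }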
@@ -215,7 +218,7 @@ public void runJob(ScheduledJobParameter scheduledJobParameter, JobExecutionCont
      * @param detector associated detector accessor
      */
     protected void runAdJob(
-        AnomalyDetectorJob jobParameter,
+        Job jobParameter,
         LockService lockService,
         LockModel lock,
         Instant detectionStartTime,
@@ -245,7 +248,7 @@ protected void runAdJob(
                 String user = userInfo.getName();
                 List<String> roles = userInfo.getRoles();
 
-                String resultIndex = jobParameter.getResultIndex();
+                String resultIndex = jobParameter.getCustomResultIndex();
                 if (resultIndex == null) {
                     runAnomalyDetectionJob(
                         jobParameter,
@@ -283,7 +286,7 @@ protected void runAdJob(
     }
 
     private void runAnomalyDetectionJob(
-        AnomalyDetectorJob jobParameter,
+        Job jobParameter,
         LockService lockService,
         LockModel lock,
         Instant detectionStartTime,
@@ -365,7 +368,7 @@ private void runAnomalyDetectionJob(
      * @param detector associated detector accessor
      */
     protected void handleAdException(
-        AnomalyDetectorJob jobParameter,
+        Job jobParameter,
         LockService lockService,
         LockModel lock,
         Instant detectionStartTime,
@@ -454,7 +457,7 @@ protected void handleAdException(
     }
 
     private void stopAdJobForEndRunException(
-        AnomalyDetectorJob jobParameter,
+        Job jobParameter,
         LockService lockService,
         LockModel lock,
         Instant detectionStartTime,
@@ -479,15 +482,15 @@ private void stopAdJobForEndRunException(
                     executionStartTime,
                     error,
                     true,
-                    ADTaskState.STOPPED.name(),
+                    TaskState.STOPPED.name(),
                     recorder,
                     detector
                 )
             );
     }
 
-    private void stopAdJob(String detectorId, AnomalyDetectorFunction function) {
-        GetRequest getRequest = new GetRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX).id(detectorId);
+    private void stopAdJob(String detectorId, ExecutorFunction function) {
+        GetRequest getRequest = new GetRequest(CommonName.JOB_INDEX).id(detectorId);
         ActionListener<GetResponse> listener = ActionListener.wrap(response -> {
             if (response.isExists()) {
                 try (
@@ -496,9 +499,9 @@ private void stopAdJob(String detectorId, AnomalyDetectorFunction function) {
                         .createParser(NamedXContentRegistry.EMPTY, LoggingDeprecationHandler.INSTANCE, response.getSourceAsString())
                 ) {
                     ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
-                    AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser);
+                    Job job = Job.parse(parser);
                     if (job.isEnabled()) {
-                        AnomalyDetectorJob newJob = new AnomalyDetectorJob(
+                        Job newJob = new Job(
                             job.getName(),
                             job.getSchedule(),
                             job.getWindowDelay(),
@@ -508,9 +511,9 @@ private void stopAdJob(String detectorId, AnomalyDetectorFunction function) {
                             Instant.now(),
                             job.getLockDurationSeconds(),
                             job.getUser(),
-                            job.getResultIndex()
+                            job.getCustomResultIndex()
                         );
-                        IndexRequest indexRequest = new IndexRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)
+                        IndexRequest indexRequest = new IndexRequest(CommonName.JOB_INDEX)
                             .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE)
                             .source(newJob.toXContent(XContentBuilder.builder(XContentType.JSON.xContent()), XCONTENT_WITH_TYPE))
                             .id(detectorId);
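The stopAdJob hunks above show the disable path after the rename: the job document is re-read from the shared job index, parsed with Job.parse, copied with the enabled flag cleared, and written back. A condensed sketch of that round trip; the constructor arguments that fall between the two hunks are not visible in this diff, so the three middle arguments marked below are assumptions:

    Job job = Job.parse(parser);
    if (job.isEnabled()) {
        Job newJob = new Job(
            job.getName(),
            job.getSchedule(),
            job.getWindowDelay(),
            false,                      // assumed: mark the job disabled
            job.getEnabledTime(),       // assumed: keep the original enabled time
            Instant.now(),              // assumed: record when the job was disabled
            Instant.now(),              // last update time (first visible line of the second hunk)
            job.getLockDurationSeconds(),
            job.getUser(),
            job.getCustomResultIndex()
        );
        IndexRequest indexRequest = new IndexRequest(CommonName.JOB_INDEX)
            .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE)
            .source(newJob.toXContent(XContentBuilder.builder(XContentType.JSON.xContent()), XCONTENT_WITH_TYPE))
            .id(detectorId);
    }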
exception.getMessage() : Throwables.getStackTraceAsString(exception); indexAnomalyResultException( @@ -593,7 +596,7 @@ private void indexAnomalyResultException( } private void indexAnomalyResultException( - AnomalyDetectorJob jobParameter, + Job jobParameter, LockService lockService, LockModel lock, Instant detectionStartTime, @@ -618,7 +621,7 @@ private void indexAnomalyResultException( } private void indexAnomalyResultException( - AnomalyDetectorJob jobParameter, + Job jobParameter, LockService lockService, LockModel lock, Instant detectionStartTime, @@ -638,7 +641,7 @@ private void indexAnomalyResultException( } } - private void releaseLock(AnomalyDetectorJob jobParameter, LockService lockService, LockModel lock) { + private void releaseLock(Job jobParameter, LockService lockService, LockModel lock) { lockService .release( lock, diff --git a/src/main/java/org/opensearch/ad/AnomalyDetectorProfileRunner.java b/src/main/java/org/opensearch/ad/AnomalyDetectorProfileRunner.java index 02989a4f9..582f3e39a 100644 --- a/src/main/java/org/opensearch/ad/AnomalyDetectorProfileRunner.java +++ b/src/main/java/org/opensearch/ad/AnomalyDetectorProfileRunner.java @@ -11,13 +11,11 @@ package org.opensearch.ad; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_PARSE_DETECTOR_MSG; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import static org.opensearch.core.rest.RestStatus.BAD_REQUEST; import static org.opensearch.core.rest.RestStatus.INTERNAL_SERVER_ERROR; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_PARSE_CONFIG_MSG; import java.util.List; import java.util.Map; @@ -32,20 +30,14 @@ import org.opensearch.action.get.GetRequest; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.common.exception.NotSerializedADExceptionName; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.DetectorProfile; import org.opensearch.ad.model.DetectorProfileName; import org.opensearch.ad.model.DetectorState; import org.opensearch.ad.model.InitProgressProfile; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.NumericSetting; +import org.opensearch.ad.settings.ADNumericSetting; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.ProfileAction; import org.opensearch.ad.transport.ProfileRequest; @@ -53,10 +45,6 @@ import org.opensearch.ad.transport.RCFPollingAction; import org.opensearch.ad.transport.RCFPollingRequest; import org.opensearch.ad.transport.RCFPollingResponse; -import org.opensearch.ad.util.DiscoveryNodeFilterer; -import org.opensearch.ad.util.ExceptionUtil; -import org.opensearch.ad.util.MultiResponsesDelegateActionListener; -import 
org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.xcontent.LoggingDeprecationHandler; @@ -74,6 +62,18 @@ import org.opensearch.search.aggregations.metrics.CardinalityAggregationBuilder; import org.opensearch.search.aggregations.metrics.InternalCardinality; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.common.exception.NotSerializedExceptionName; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; public class AnomalyDetectorProfileRunner extends AbstractProfileRunner { @@ -105,12 +105,12 @@ public AnomalyDetectorProfileRunner( } this.transportService = transportService; this.adTaskManager = adTaskManager; - this.maxTotalEntitiesToTrack = AnomalyDetectorSettings.MAX_TOTAL_ENTITIES_TO_TRACK; + this.maxTotalEntitiesToTrack = TimeSeriesSettings.MAX_TOTAL_ENTITIES_TO_TRACK; } public void profile(String detectorId, ActionListener listener, Set profilesToCollect) { if (profilesToCollect.isEmpty()) { - listener.onFailure(new IllegalArgumentException(CommonErrorMessages.EMPTY_PROFILES_COLLECT)); + listener.onFailure(new IllegalArgumentException(CommonMessages.EMPTY_PROFILES_COLLECT)); return; } calculateTotalResponsesToWait(detectorId, profilesToCollect, listener); @@ -121,7 +121,7 @@ private void calculateTotalResponsesToWait( Set profilesToCollect, ActionListener listener ) { - GetRequest getDetectorRequest = new GetRequest(ANOMALY_DETECTORS_INDEX, detectorId); + GetRequest getDetectorRequest = new GetRequest(CommonName.CONFIG_INDEX, detectorId); client.get(getDetectorRequest, ActionListener.wrap(getDetectorResponse -> { if (getDetectorResponse != null && getDetectorResponse.isExists()) { try ( @@ -133,15 +133,15 @@ private void calculateTotalResponsesToWait( AnomalyDetector detector = AnomalyDetector.parse(xContentParser, detectorId); prepareProfile(detector, listener, profilesToCollect); } catch (Exception e) { - logger.error(FAIL_TO_PARSE_DETECTOR_MSG + detectorId, e); - listener.onFailure(new OpenSearchStatusException(FAIL_TO_PARSE_DETECTOR_MSG + detectorId, BAD_REQUEST)); + logger.error(FAIL_TO_PARSE_CONFIG_MSG + detectorId, e); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_PARSE_CONFIG_MSG + detectorId, BAD_REQUEST)); } } else { - listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, BAD_REQUEST)); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + detectorId, BAD_REQUEST)); } }, exception -> { - logger.error(FAIL_TO_FIND_DETECTOR_MSG + detectorId, exception); - listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, INTERNAL_SERVER_ERROR)); + logger.error(FAIL_TO_FIND_CONFIG_MSG + detectorId, exception); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + 
detectorId, INTERNAL_SERVER_ERROR)); })); } @@ -150,8 +150,8 @@ private void prepareProfile( ActionListener listener, Set profilesToCollect ) { - String detectorId = detector.getDetectorId(); - GetRequest getRequest = new GetRequest(ANOMALY_DETECTOR_JOB_INDEX, detectorId); + String detectorId = detector.getId(); + GetRequest getRequest = new GetRequest(CommonName.JOB_INDEX, detectorId); client.get(getRequest, ActionListener.wrap(getResponse -> { if (getResponse != null && getResponse.isExists()) { try ( @@ -160,10 +160,10 @@ private void prepareProfile( .createParser(xContentRegistry, LoggingDeprecationHandler.INSTANCE, getResponse.getSourceAsString()) ) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); + Job job = Job.parse(parser); long enabledTimeMs = job.getEnabledTime().toEpochMilli(); - boolean isMultiEntityDetector = detector.isMultientityDetector(); + boolean isMultiEntityDetector = detector.isHighCardinality(); int totalResponsesToWait = 0; if (profilesToCollect.contains(DetectorProfileName.ERROR)) { @@ -208,7 +208,7 @@ private void prepareProfile( new MultiResponsesDelegateActionListener( listener, totalResponsesToWait, - CommonErrorMessages.FAIL_FETCH_ERR_MSG + detectorId, + CommonMessages.FAIL_FETCH_ERR_MSG + detectorId, false ); if (profilesToCollect.contains(DetectorProfileName.ERROR)) { @@ -267,7 +267,7 @@ private void prepareProfile( } } catch (Exception e) { - logger.error(CommonErrorMessages.FAIL_TO_GET_PROFILE_MSG, e); + logger.error(CommonMessages.FAIL_TO_GET_PROFILE_MSG, e); listener.onFailure(e); } } else { @@ -278,34 +278,34 @@ private void prepareProfile( logger.info(exception.getMessage()); onGetDetectorForPrepare(detectorId, listener, profilesToCollect); } else { - logger.error(CommonErrorMessages.FAIL_TO_GET_PROFILE_MSG + detectorId); + logger.error(CommonMessages.FAIL_TO_GET_PROFILE_MSG + detectorId); listener.onFailure(exception); } })); } private void profileEntityStats(MultiResponsesDelegateActionListener listener, AnomalyDetector detector) { - List categoryField = detector.getCategoryField(); - if (!detector.isMultientityDetector() || categoryField.size() > NumericSetting.maxCategoricalFields()) { + List categoryField = detector.getCategoryFields(); + if (!detector.isHighCardinality() || categoryField.size() > ADNumericSetting.maxCategoricalFields()) { listener.onResponse(new DetectorProfile.Builder().build()); } else { if (categoryField.size() == 1) { // Run a cardinality aggregation to count the cardinality of single category fields SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder(); - CardinalityAggregationBuilder aggBuilder = new CardinalityAggregationBuilder(CommonName.TOTAL_ENTITIES); + CardinalityAggregationBuilder aggBuilder = new CardinalityAggregationBuilder(ADCommonName.TOTAL_ENTITIES); aggBuilder.field(categoryField.get(0)); searchSourceBuilder.aggregation(aggBuilder); SearchRequest request = new SearchRequest(detector.getIndices().toArray(new String[0]), searchSourceBuilder); final ActionListener searchResponseListener = ActionListener.wrap(searchResponse -> { Map aggMap = searchResponse.getAggregations().asMap(); - InternalCardinality totalEntities = (InternalCardinality) aggMap.get(CommonName.TOTAL_ENTITIES); + InternalCardinality totalEntities = (InternalCardinality) aggMap.get(ADCommonName.TOTAL_ENTITIES); long value = totalEntities.getValue(); DetectorProfile.Builder profileBuilder = new DetectorProfile.Builder(); 
DetectorProfile profile = profileBuilder.totalEntities(value).build(); listener.onResponse(profile); }, searchException -> { - logger.warn(CommonErrorMessages.FAIL_TO_GET_TOTAL_ENTITIES + detector.getDetectorId()); + logger.warn(CommonMessages.FAIL_TO_GET_TOTAL_ENTITIES + detector.getId()); listener.onFailure(searchException); }); // using the original context in listener as user roles have no permissions for internal operations like fetching a @@ -314,16 +314,21 @@ private void profileEntityStats(MultiResponsesDelegateActionListenerasyncRequestWithInjectedSecurity( request, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } else { // Run a composite query and count the number of buckets to decide cardinality of multiple category fields AggregationBuilder bucketAggs = AggregationBuilders .composite( - CommonName.TOTAL_ENTITIES, - detector.getCategoryField().stream().map(f -> new TermsValuesSourceBuilder(f).field(f)).collect(Collectors.toList()) + ADCommonName.TOTAL_ENTITIES, + detector + .getCategoryFields() + .stream() + .map(f -> new TermsValuesSourceBuilder(f).field(f)) + .collect(Collectors.toList()) ) .size(maxTotalEntitiesToTrack); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().aggregation(bucketAggs).trackTotalHits(false).size(0); @@ -344,7 +349,7 @@ private void profileEntityStats(MultiResponsesDelegateActionListener { - logger.warn(CommonErrorMessages.FAIL_TO_GET_TOTAL_ENTITIES + detector.getDetectorId()); + logger.warn(CommonMessages.FAIL_TO_GET_TOTAL_ENTITIES + detector.getId()); listener.onFailure(searchException); }); // using the original context in listener as user roles have no permissions for internal operations like fetching a @@ -363,8 +368,9 @@ private void profileEntityStats(MultiResponsesDelegateActionListenerasyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } @@ -401,7 +407,7 @@ private void profileStateRelated( Set profilesToCollect ) { if (enabled) { - RCFPollingRequest request = new RCFPollingRequest(detector.getDetectorId()); + RCFPollingRequest request = new RCFPollingRequest(detector.getId()); client.execute(RCFPollingAction.INSTANCE, request, onPollRCFUpdates(detector, profilesToCollect, listener)); } else { DetectorProfile.Builder builder = new DetectorProfile.Builder(); @@ -415,22 +421,22 @@ private void profileStateRelated( private void profileModels( AnomalyDetector detector, Set profiles, - AnomalyDetectorJob job, + Job job, boolean forMultiEntityDetector, MultiResponsesDelegateActionListener listener ) { DiscoveryNode[] dataNodes = nodeFilter.getEligibleDataNodes(); - ProfileRequest profileRequest = new ProfileRequest(detector.getDetectorId(), profiles, forMultiEntityDetector, dataNodes); + ProfileRequest profileRequest = new ProfileRequest(detector.getId(), profiles, forMultiEntityDetector, dataNodes); client.execute(ProfileAction.INSTANCE, profileRequest, onModelResponse(detector, profiles, job, listener));// get init progress } private ActionListener onModelResponse( AnomalyDetector detector, Set profilesToCollect, - AnomalyDetectorJob job, + Job job, MultiResponsesDelegateActionListener listener ) { - boolean isMultientityDetector = detector.isMultientityDetector(); + boolean isMultientityDetector = detector.isHighCardinality(); return ActionListener.wrap(profileResponse -> { DetectorProfile.Builder profile = new DetectorProfile.Builder(); if 
(profilesToCollect.contains(DetectorProfileName.COORDINATING_NODE)) { @@ -461,7 +467,7 @@ private ActionListener onModelResponse( } private void profileMultiEntityDetectorStateRelated( - AnomalyDetectorJob job, + Job job, Set profilesToCollect, ProfileResponse profileResponse, DetectorProfile.Builder profileBuilder, @@ -517,7 +523,7 @@ private ActionListener onInittedEver( logger .error( "Fail to find any anomaly result with anomaly score larger than 0 after AD job enabled time for detector {}", - detector.getDetectorId() + detector.getId() ); listener.onFailure(exception); } @@ -555,10 +561,10 @@ private ActionListener onPollRCFUpdates( .isException( causeException, ResourceNotFoundException.class, - NotSerializedADExceptionName.RESOURCE_NOT_FOUND_EXCEPTION_NAME_UNDERSCORE.getName() + NotSerializedExceptionName.RESOURCE_NOT_FOUND_EXCEPTION_NAME_UNDERSCORE.getName() ) || (ExceptionUtil.isIndexNotAvailable(causeException) - && causeException.getMessage().contains(CommonName.CHECKPOINT_INDEX_NAME))) { + && causeException.getMessage().contains(ADCommonName.CHECKPOINT_INDEX_NAME))) { // cannot find checkpoint // We don't want to show the estimated time remaining to initialize // a detector before cold start finishes, where the actual @@ -566,11 +572,7 @@ private ActionListener onPollRCFUpdates( // data exists. processInitResponse(detector, profilesToCollect, 0L, true, new DetectorProfile.Builder(), listener); } else { - logger - .error( - new ParameterizedMessage("Fail to get init progress through messaging for {}", detector.getDetectorId()), - exception - ); + logger.error(new ParameterizedMessage("Fail to get init progress through messaging for {}", detector.getId()), exception); listener.onFailure(exception); } }); @@ -604,7 +606,7 @@ private void processInitResponse( InitProgressProfile initProgress = computeInitProgressProfile(totalUpdates, 0); builder.initProgress(initProgress); } else { - long intervalMins = ((IntervalTimeConfiguration) detector.getDetectionInterval()).toDuration().toMinutes(); + long intervalMins = ((IntervalTimeConfiguration) detector.getInterval()).toDuration().toMinutes(); InitProgressProfile initProgress = computeInitProgressProfile(totalUpdates, intervalMins); builder.initProgress(initProgress); } diff --git a/src/main/java/org/opensearch/ad/AnomalyDetectorRunner.java b/src/main/java/org/opensearch/ad/AnomalyDetectorRunner.java index 22d3e4850..c5336316c 100644 --- a/src/main/java/org/opensearch/ad/AnomalyDetectorRunner.java +++ b/src/main/java/org/opensearch/ad/AnomalyDetectorRunner.java @@ -18,6 +18,7 @@ import java.util.List; import java.util.Locale; import java.util.Map; +import java.util.Optional; import java.util.stream.Collectors; import org.apache.logging.log4j.LogManager; @@ -30,13 +31,13 @@ import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.EntityAnomalyResult; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.model.FeatureData; -import org.opensearch.ad.util.MultiResponsesDelegateActionListener; import org.opensearch.common.util.concurrent.ThreadContext; import org.opensearch.core.action.ActionListener; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener; /** * Runner to trigger an anomaly detector. 
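The processInitResponse hunk in the profile runner above turns the RCF model's total update count and the detector interval into an InitProgressProfile. A minimal sketch of that estimation, assuming a hypothetical REQUIRED_SAMPLES threshold (the real cutoff comes from the plugin's settings classes):

    // Illustrative init-progress math; REQUIRED_SAMPLES is a stand-in constant.
    class InitProgressSketch {
        private static final long REQUIRED_SAMPLES = 128;

        // totalUpdates: RCF updates so far; intervalMins: detector interval length.
        static String estimate(long totalUpdates, long intervalMins) {
            double fraction = Math.min(1.0d, (double) totalUpdates / REQUIRED_SAMPLES);
            long minutesLeft = totalUpdates >= REQUIRED_SAMPLES
                ? 0
                : (REQUIRED_SAMPLES - totalUpdates) * intervalMins;
            return String.format("%.0f%% initialized, about %d minute(s) left", fraction * 100, minutesLeft);
        }
    }

This is also why the onPollRCFUpdates hunk suppresses the estimate when no checkpoint exists yet: before cold start finishes, totalUpdates would undercount and the projected time remaining would mislead.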
@@ -72,7 +73,7 @@ public void executeDetector( ActionListener> listener ) throws IOException { context.restore(); - List categoryField = detector.getCategoryField(); + List categoryField = detector.getCategoryFields(); if (categoryField != null && !categoryField.isEmpty()) { featureManager.getPreviewEntities(detector, startTime.toEpochMilli(), endTime.toEpochMilli(), ActionListener.wrap(entities -> { @@ -85,12 +86,12 @@ public void executeDetector( } ActionListener entityAnomalyResultListener = ActionListener.wrap(entityAnomalyResult -> { listener.onResponse(entityAnomalyResult.getAnomalyResults()); - }, e -> onFailure(e, listener, detector.getDetectorId())); + }, e -> onFailure(e, listener, detector.getId())); MultiResponsesDelegateActionListener multiEntitiesResponseListener = new MultiResponsesDelegateActionListener( entityAnomalyResultListener, entities.size(), - String.format(Locale.ROOT, "Fail to get preview result for multi entity detector %s", detector.getDetectorId()), + String.format(Locale.ROOT, "Fail to get preview result for multi entity detector %s", detector.getId()), true ); for (Entity entity : entities) { @@ -111,7 +112,7 @@ public void executeDetector( }, e -> multiEntitiesResponseListener.onFailure(e)) ); } - }, e -> onFailure(e, listener, detector.getDetectorId()))); + }, e -> onFailure(e, listener, detector.getId()))); } else { featureManager.getPreviewFeatures(detector, startTime.toEpochMilli(), endTime.toEpochMilli(), ActionListener.wrap(features -> { try { @@ -119,9 +120,9 @@ public void executeDetector( .getPreviewResults(features.getProcessedFeatures(), detector.getShingleSize()); listener.onResponse(sample(parsePreviewResult(detector, features, results, null), maxPreviewResults)); } catch (Exception e) { - onFailure(e, listener, detector.getDetectorId()); + onFailure(e, listener, detector.getId()); } - }, e -> onFailure(e, listener, detector.getDetectorId()))); + }, e -> onFailure(e, listener, detector.getId()))); } } @@ -166,23 +167,26 @@ private List parsePreviewResult( AnomalyResult result; if (results != null && results.size() > i) { ThresholdingResult thresholdingResult = results.get(i); - result = thresholdingResult - .toAnomalyResult( + List resultsToSave = thresholdingResult + .toIndexableResults( detector, Instant.ofEpochMilli(timeRange.getKey()), Instant.ofEpochMilli(timeRange.getValue()), null, null, featureDatas, - entity, + Optional.ofNullable(entity), CommonValue.NO_SCHEMA_VERSION, null, null, null ); + for (AnomalyResult r : resultsToSave) { + anomalyResults.add(r); + } } else { result = new AnomalyResult( - detector.getDetectorId(), + detector.getId(), null, featureDatas, Instant.ofEpochMilli(timeRange.getKey()), @@ -190,14 +194,13 @@ private List parsePreviewResult( null, null, null, - entity, + Optional.ofNullable(entity), detector.getUser(), CommonValue.NO_SCHEMA_VERSION, null ); + anomalyResults.add(result); } - - anomalyResults.add(result); } } return anomalyResults; diff --git a/src/main/java/org/opensearch/ad/EntityProfileRunner.java b/src/main/java/org/opensearch/ad/EntityProfileRunner.java index 00c83e175..ce14dbef2 100644 --- a/src/main/java/org/opensearch/ad/EntityProfileRunner.java +++ b/src/main/java/org/opensearch/ad/EntityProfileRunner.java @@ -11,8 +11,6 @@ package org.opensearch.ad; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; 
import java.util.List; @@ -26,24 +24,17 @@ import org.opensearch.action.get.GetRequest; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.EntityProfile; import org.opensearch.ad.model.EntityProfileName; import org.opensearch.ad.model.EntityState; import org.opensearch.ad.model.InitProgressProfile; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.settings.NumericSetting; +import org.opensearch.ad.settings.ADNumericSetting; import org.opensearch.ad.transport.EntityProfileAction; import org.opensearch.ad.transport.EntityProfileRequest; import org.opensearch.ad.transport.EntityProfileResponse; -import org.opensearch.ad.util.MultiResponsesDelegateActionListener; -import org.opensearch.ad.util.ParseUtils; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.routing.Preference; import org.opensearch.common.xcontent.LoggingDeprecationHandler; @@ -58,6 +49,15 @@ import org.opensearch.index.query.TermQueryBuilder; import org.opensearch.search.aggregations.AggregationBuilders; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener; +import org.opensearch.timeseries.util.ParseUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; public class EntityProfileRunner extends AbstractProfileRunner { private final Logger logger = LogManager.getLogger(EntityProfileRunner.class); @@ -91,10 +91,10 @@ public void profile( ActionListener listener ) { if (profilesToCollect == null || profilesToCollect.size() == 0) { - listener.onFailure(new IllegalArgumentException(CommonErrorMessages.EMPTY_PROFILES_COLLECT)); + listener.onFailure(new IllegalArgumentException(CommonMessages.EMPTY_PROFILES_COLLECT)); return; } - GetRequest getDetectorRequest = new GetRequest(ANOMALY_DETECTORS_INDEX, detectorId); + GetRequest getDetectorRequest = new GetRequest(CommonName.CONFIG_INDEX, detectorId); client.get(getDetectorRequest, ActionListener.wrap(getResponse -> { if (getResponse != null && getResponse.isExists()) { @@ -105,13 +105,12 @@ public void profile( ) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); AnomalyDetector detector = AnomalyDetector.parse(parser, detectorId); - List categoryFields = detector.getCategoryField(); - int maxCategoryFields = NumericSetting.maxCategoricalFields(); + List categoryFields = detector.getCategoryFields(); + int maxCategoryFields = ADNumericSetting.maxCategoricalFields(); if (categoryFields == null || categoryFields.size() == 0) { listener.onFailure(new IllegalArgumentException(NOT_HC_DETECTOR_ERR_MSG)); } else if (categoryFields.size() > maxCategoryFields) { - listener - .onFailure(new 
IllegalArgumentException(CommonErrorMessages.getTooManyCategoricalFieldErr(maxCategoryFields))); + listener.onFailure(new IllegalArgumentException(CommonMessages.getTooManyCategoricalFieldErr(maxCategoryFields))); } else { validateEntity(entityValue, categoryFields, detectorId, profilesToCollect, detector, listener); } @@ -119,7 +118,7 @@ public void profile( listener.onFailure(t); } } else { - listener.onFailure(new IllegalArgumentException(CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG + detectorId)); + listener.onFailure(new IllegalArgumentException(CommonMessages.FAIL_TO_FIND_CONFIG_MSG + detectorId)); } }, listener::onFailure)); } @@ -187,8 +186,9 @@ private void validateEntity( .asyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); @@ -220,7 +220,7 @@ private void getJob( EntityProfileResponse entityProfileResponse, ActionListener listener ) { - GetRequest getRequest = new GetRequest(ANOMALY_DETECTOR_JOB_INDEX, detectorId); + GetRequest getRequest = new GetRequest(CommonName.JOB_INDEX, detectorId); client.get(getRequest, ActionListener.wrap(getResponse -> { if (getResponse != null && getResponse.isExists()) { try ( @@ -229,7 +229,7 @@ private void getJob( .createParser(xContentRegistry, LoggingDeprecationHandler.INSTANCE, getResponse.getSourceAsString()) ) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); + Job job = Job.parse(parser); int totalResponsesToWait = 0; if (profilesToCollect.contains(EntityProfileName.INIT_PROGRESS) @@ -246,7 +246,7 @@ private void getJob( new MultiResponsesDelegateActionListener( listener, totalResponsesToWait, - CommonErrorMessages.FAIL_FETCH_ERR_MSG + entityValue + " of detector " + detectorId, + CommonMessages.FAIL_FETCH_ERR_MSG + entityValue + " of detector " + detectorId, false ); @@ -278,7 +278,7 @@ private void getJob( detectorId, enabledTimeMs, entityValue, - detector.getResultIndex() + detector.getCustomResultIndex() ); EntityProfile.Builder builder = new EntityProfile.Builder(); @@ -309,7 +309,7 @@ private void getJob( })); } } catch (Exception e) { - logger.error(CommonErrorMessages.FAIL_TO_GET_PROFILE_MSG, e); + logger.error(CommonMessages.FAIL_TO_GET_PROFILE_MSG, e); listener.onFailure(e); } } else { @@ -320,7 +320,7 @@ private void getJob( logger.info(exception.getMessage()); sendUnknownState(profilesToCollect, entityValue, true, listener); } else { - logger.error(CommonErrorMessages.FAIL_TO_GET_PROFILE_MSG + detectorId, exception); + logger.error(CommonMessages.FAIL_TO_GET_PROFILE_MSG + detectorId, exception); listener.onFailure(exception); } })); @@ -332,7 +332,7 @@ private void profileStateRelated( Entity entityValue, Set profilesToCollect, AnomalyDetector detector, - AnomalyDetectorJob job, + Job job, MultiResponsesDelegateActionListener delegateListener ) { if (totalUpdates == 0) { @@ -398,7 +398,7 @@ private void sendInitState( builder.state(EntityState.INIT); } if (profilesToCollect.contains(EntityProfileName.INIT_PROGRESS)) { - long intervalMins = ((IntervalTimeConfiguration) detector.getDetectionInterval()).toDuration().toMinutes(); + long intervalMins = ((IntervalTimeConfiguration) detector.getInterval()).toDuration().toMinutes(); InitProgressProfile initProgress = computeInitProgressProfile(updates, intervalMins); builder.initProgress(initProgress); } @@ -457,15 +457,15 @@ private SearchRequest 
createLastSampleTimeRequest(String detectorId, long enable boolQueryBuilder.filter(QueryBuilders.termQuery(AnomalyResult.DETECTOR_ID_FIELD, detectorId)); - boolQueryBuilder.filter(QueryBuilders.rangeQuery(AnomalyResult.EXECUTION_END_TIME_FIELD).gte(enabledTime)); + boolQueryBuilder.filter(QueryBuilders.rangeQuery(CommonName.EXECUTION_END_TIME_FIELD).gte(enabledTime)); SearchSourceBuilder source = new SearchSourceBuilder() .query(boolQueryBuilder) - .aggregation(AggregationBuilders.max(CommonName.AGG_NAME_MAX_TIME).field(AnomalyResult.EXECUTION_END_TIME_FIELD)) + .aggregation(AggregationBuilders.max(CommonName.AGG_NAME_MAX_TIME).field(CommonName.EXECUTION_END_TIME_FIELD)) .trackTotalHits(false) .size(0); - SearchRequest request = new SearchRequest(CommonName.ANOMALY_RESULT_INDEX_ALIAS); + SearchRequest request = new SearchRequest(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS); request.source(source); if (resultIndex != null) { request.indices(resultIndex); diff --git a/src/main/java/org/opensearch/ad/ExecuteADResultResponseRecorder.java b/src/main/java/org/opensearch/ad/ExecuteADResultResponseRecorder.java index 414db3a70..6590cead6 100644 --- a/src/main/java/org/opensearch/ad/ExecuteADResultResponseRecorder.java +++ b/src/main/java/org/opensearch/ad/ExecuteADResultResponseRecorder.java @@ -11,28 +11,24 @@ package org.opensearch.ad; -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_LATEST_TASK; +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_LATEST_TASK; import java.time.Instant; import java.util.ArrayList; import java.util.HashSet; +import java.util.Optional; import java.util.Set; import java.util.concurrent.TimeUnit; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.update.UpdateResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.indices.ADIndex; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.model.DetectorProfileName; -import org.opensearch.ad.model.FeatureData; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.task.ADTaskCacheManager; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.AnomalyResultResponse; @@ -41,8 +37,6 @@ import org.opensearch.ad.transport.RCFPollingAction; import org.opensearch.ad.transport.RCFPollingRequest; import org.opensearch.ad.transport.handler.AnomalyIndexHandler; -import org.opensearch.ad.util.DiscoveryNodeFilterer; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.unit.TimeValue; @@ -50,11 +44,21 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.search.SearchHits; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.common.exception.EndRunException; +import 
org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.ExceptionUtil; public class ExecuteADResultResponseRecorder { private static final Logger log = LogManager.getLogger(ExecuteADResultResponseRecorder.class); - private AnomalyDetectionIndices anomalyDetectionIndices; + private ADIndexManagement anomalyDetectionIndices; private AnomalyIndexHandler anomalyResultHandler; private ADTaskManager adTaskManager; private DiscoveryNodeFilterer nodeFilter; @@ -65,7 +69,7 @@ public class ExecuteADResultResponseRecorder { private int rcfMinSamples; public ExecuteADResultResponseRecorder( - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, AnomalyIndexHandler anomalyResultHandler, ADTaskManager adTaskManager, DiscoveryNodeFilterer nodeFilter, @@ -92,7 +96,7 @@ public void indexAnomalyResult( AnomalyResultResponse response, AnomalyDetector detector ) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); try { // skipping writing to the result index if not necessary // For a single-entity detector, the result is not useful if error is null @@ -124,7 +128,7 @@ public void indexAnomalyResult( response.getError() ); - String resultIndex = detector.getResultIndex(); + String resultIndex = detector.getCustomResultIndex(); anomalyResultHandler.index(anomalyResult, detectorId, resultIndex); updateRealtimeTask(response, detectorId); } catch (EndRunException e) { @@ -156,20 +160,15 @@ private void updateRealtimeTask(AnomalyResultResponse response, String detectorI Runnable profileHCInitProgress = () -> { client.execute(ProfileAction.INSTANCE, profileRequest, ActionListener.wrap(r -> { log.debug("Update latest realtime task for HC detector {}, total updates: {}", detectorId, r.getTotalUpdates()); - updateLatestRealtimeTask( - detectorId, - null, - r.getTotalUpdates(), - response.getDetectorIntervalInMinutes(), - response.getError() - ); + updateLatestRealtimeTask(detectorId, null, r.getTotalUpdates(), response.getIntervalInMinutes(), response.getError()); }, e -> { log.error("Failed to update latest realtime task for " + detectorId, e); })); }; if (!adTaskManager.isHCRealtimeTaskStartInitializing(detectorId)) { // real time init progress is 0 may mean this is a newly started detector // Delay real time cache update by one minute. If we are in init status, the delay may give the model training time to // finish. We can change the detector running immediately instead of waiting for the next interval. 
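    // The one-minute deferral below is plain ThreadPool scheduling. A minimal
    // standalone sketch, assuming only a ThreadPool handle and a profiling task;
    // the executor name and delay mirror the patched call.
    import java.util.concurrent.TimeUnit;

    import org.opensearch.common.unit.TimeValue;
    import org.opensearch.threadpool.ThreadPool;
    import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

    class DelayedRealtimeUpdateSketch {
        // If the realtime task has not started initializing, wait a minute so
        // model training can catch up; otherwise refresh the cache right away.
        static void run(ThreadPool threadPool, Runnable profileHCInitProgress, boolean startedInitializing) {
            if (startedInitializing) {
                profileHCInitProgress.run();
            } else {
                threadPool
                    .schedule(profileHCInitProgress, new TimeValue(60, TimeUnit.SECONDS), TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME);
            }
        }
    }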
- threadPool.schedule(profileHCInitProgress, new TimeValue(60, TimeUnit.SECONDS), AnomalyDetectorPlugin.AD_THREAD_POOL_NAME); + threadPool + .schedule(profileHCInitProgress, new TimeValue(60, TimeUnit.SECONDS), TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME); } else { profileHCInitProgress.run(); } @@ -181,13 +180,7 @@ private void updateRealtimeTask(AnomalyResultResponse response, String detectorI detectorId, response.getRcfTotalUpdates() ); - updateLatestRealtimeTask( - detectorId, - null, - response.getRcfTotalUpdates(), - response.getDetectorIntervalInMinutes(), - response.getError() - ); + updateLatestRealtimeTask(detectorId, null, response.getRcfTotalUpdates(), response.getIntervalInMinutes(), response.getError()); } } @@ -278,7 +271,7 @@ public void indexAnomalyResultException( String taskState, AnomalyDetector detector ) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); try { IntervalTimeConfiguration windowDelay = (IntervalTimeConfiguration) detector.getWindowDelay(); Instant dataStartTime = detectionStartTime.minus(windowDelay.getInterval(), windowDelay.getUnit()); @@ -294,12 +287,12 @@ public void indexAnomalyResultException( executionStartTime, Instant.now(), errorMessage, - null, // single-stream detectors have no entity + Optional.empty(), // single-stream detectors have no entity user, anomalyDetectionIndices.getSchemaVersion(ADIndex.RESULT), null // no model id ); - String resultIndex = detector.getResultIndex(); + String resultIndex = detector.getCustomResultIndex(); if (resultIndex != null && !anomalyDetectionIndices.doesIndexExist(resultIndex)) { // Set result index as null, will write exception to default result index. anomalyResultHandler.index(anomalyResult, detectorId, null); @@ -307,7 +300,7 @@ public void indexAnomalyResultException( anomalyResultHandler.index(anomalyResult, detectorId, resultIndex); } - if (errorMessage.contains(CommonErrorMessages.NO_MODEL_ERR_MSG) && !detector.isMultiCategoryDetector()) { + if (errorMessage.contains(ADCommonMessages.NO_MODEL_ERR_MSG) && !detector.isHighCardinality()) { // single stream detector raises ResourceNotFoundException containing CommonErrorMessages.NO_CHECKPOINT_ERR_MSG // when there is no checkpoint. // Delay real time cache update by one minute so we will have trained models by then and update the state @@ -321,14 +314,14 @@ public void indexAnomalyResultException( detectorId, taskState, totalUpdates, - detector.getDetectorIntervalInMinutes(), + detector.getIntervalInMinutes(), totalUpdates > 0 ? 
"" : errorMessage ); }, e -> { log.error("Fail to execute RCFRollingAction", e); updateLatestRealtimeTask(detectorId, taskState, null, null, errorMessage); })); - }, new TimeValue(60, TimeUnit.SECONDS), AnomalyDetectorPlugin.AD_THREAD_POOL_NAME); + }, new TimeValue(60, TimeUnit.SECONDS), TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME); } else { updateLatestRealtimeTask(detectorId, taskState, null, null, errorMessage); } @@ -346,20 +339,20 @@ private void confirmTotalRCFUpdatesFound( String error, ActionListener listener ) { - nodeStateManager.getAnomalyDetector(detectorId, ActionListener.wrap(detectorOptional -> { + nodeStateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(detectorOptional -> { if (!detectorOptional.isPresent()) { - listener.onFailure(new AnomalyDetectionException(detectorId, "fail to get detector")); + listener.onFailure(new TimeSeriesException(detectorId, "fail to get detector")); return; } - nodeStateManager.getAnomalyDetectorJob(detectorId, ActionListener.wrap(jobOptional -> { + nodeStateManager.getJob(detectorId, ActionListener.wrap(jobOptional -> { if (!jobOptional.isPresent()) { - listener.onFailure(new AnomalyDetectionException(detectorId, "fail to get job")); + listener.onFailure(new TimeSeriesException(detectorId, "fail to get job")); return; } ProfileUtil .confirmDetectorRealtimeInitStatus( - detectorOptional.get(), + (AnomalyDetector) detectorOptional.get(), jobOptional.get().getEnabledTime().toEpochMilli(), client, ActionListener.wrap(searchResponse -> { @@ -384,7 +377,7 @@ private void confirmTotalRCFUpdatesFound( } }) ); - }, e -> listener.onFailure(new AnomalyDetectionException(detectorId, "fail to get job")))); - }, e -> listener.onFailure(new AnomalyDetectionException(detectorId, "fail to get detector")))); + }, e -> listener.onFailure(new TimeSeriesException(detectorId, "fail to get job")))); + }, e -> listener.onFailure(new TimeSeriesException(detectorId, "fail to get detector")))); } } diff --git a/src/main/java/org/opensearch/ad/ProfileUtil.java b/src/main/java/org/opensearch/ad/ProfileUtil.java index d44125cc9..3d77924d0 100644 --- a/src/main/java/org/opensearch/ad/ProfileUtil.java +++ b/src/main/java/org/opensearch/ad/ProfileUtil.java @@ -13,7 +13,7 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.client.Client; @@ -22,6 +22,7 @@ import org.opensearch.index.query.ExistsQueryBuilder; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.constant.CommonName; public class ProfileUtil { /** @@ -35,16 +36,16 @@ public class ProfileUtil { private static SearchRequest createRealtimeInittedEverRequest(String detectorId, long enabledTime, String resultIndex) { BoolQueryBuilder filterQuery = new BoolQueryBuilder(); filterQuery.filter(QueryBuilders.termQuery(AnomalyResult.DETECTOR_ID_FIELD, detectorId)); - filterQuery.filter(QueryBuilders.rangeQuery(AnomalyResult.EXECUTION_END_TIME_FIELD).gte(enabledTime)); + filterQuery.filter(QueryBuilders.rangeQuery(CommonName.EXECUTION_END_TIME_FIELD).gte(enabledTime)); filterQuery.filter(QueryBuilders.rangeQuery(AnomalyResult.ANOMALY_SCORE_FIELD).gt(0)); // Historical analysis result also stored in result index, which has non-null task_id. 
// For realtime detection result, we should filter task_id == null - ExistsQueryBuilder taskIdExistsFilter = QueryBuilders.existsQuery(AnomalyResult.TASK_ID_FIELD); + ExistsQueryBuilder taskIdExistsFilter = QueryBuilders.existsQuery(CommonName.TASK_ID_FIELD); filterQuery.mustNot(taskIdExistsFilter); SearchSourceBuilder source = new SearchSourceBuilder().query(filterQuery).size(1); - SearchRequest request = new SearchRequest(CommonName.ANOMALY_RESULT_INDEX_ALIAS); + SearchRequest request = new SearchRequest(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS); request.source(source); if (resultIndex != null) { request.indices(resultIndex); @@ -58,11 +59,7 @@ public static void confirmDetectorRealtimeInitStatus( Client client, ActionListener listener ) { - SearchRequest searchLatestResult = createRealtimeInittedEverRequest( - detector.getDetectorId(), - enabledTime, - detector.getResultIndex() - ); + SearchRequest searchLatestResult = createRealtimeInittedEverRequest(detector.getId(), enabledTime, detector.getCustomResultIndex()); client.search(searchLatestResult, listener); } } diff --git a/src/main/java/org/opensearch/ad/caching/CacheBuffer.java b/src/main/java/org/opensearch/ad/caching/CacheBuffer.java index bed7052c9..fb48fd273 100644 --- a/src/main/java/org/opensearch/ad/caching/CacheBuffer.java +++ b/src/main/java/org/opensearch/ad/caching/CacheBuffer.java @@ -25,9 +25,6 @@ import org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.ExpiringState; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.MemoryTracker.Origin; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.InitProgressProfile; @@ -36,6 +33,9 @@ import org.opensearch.ad.ratelimit.CheckpointWriteWorker; import org.opensearch.ad.ratelimit.RequestPriority; import org.opensearch.ad.util.DateUtils; +import org.opensearch.timeseries.ExpiringState; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.MemoryTracker.Origin; /** * We use a layered cache to manage active entities’ states. We have a two-level @@ -159,7 +159,7 @@ private void put(String entityModelId, ModelState value, float prio // Since we have already considered them while allocating CacheBuffer, // skip bookkeeping. if (!sharedCacheEmpty()) { - memoryTracker.consumeMemory(memoryConsumptionPerEntity, false, Origin.HC_DETECTOR); + memoryTracker.consumeMemory(memoryConsumptionPerEntity, false, Origin.REAL_TIME_DETECTOR); } } else { update(entityModelId); @@ -267,7 +267,7 @@ public ModelState remove(String keyToRemove, boolean saveCheckpoint if (valueRemoved != null) { if (!reserved) { // release in shared memory - memoryTracker.releaseMemory(memoryConsumptionPerEntity, false, Origin.HC_DETECTOR); + memoryTracker.releaseMemory(memoryConsumptionPerEntity, false, Origin.REAL_TIME_DETECTOR); } EntityModel modelRemoved = valueRemoved.getModel(); @@ -460,9 +460,9 @@ public void clear() { // not a problem as we are releasing memory in MemoryTracker. // The newly added one loses references and soon GC will collect it. // We have memory tracking correction to fix incorrect memory usage record. 
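    // Sketch of the reserved-vs-shared accounting CacheBuffer performs around
    // this point: the dedicated slice is charged once as reserved memory, while
    // overflow entities borrow from the shared pool and are charged per entity.
    // MemoryTracker, Origin, and the call signatures come from the hunks in
    // this file; the wrapper class itself is illustrative.
    import org.opensearch.timeseries.MemoryTracker;
    import org.opensearch.timeseries.MemoryTracker.Origin;

    class MemoryAccountingSketch {
        private final MemoryTracker memoryTracker;
        private final long bytesPerEntity;

        MemoryAccountingSketch(MemoryTracker memoryTracker, long bytesPerEntity) {
            this.memoryTracker = memoryTracker;
            this.bytesPerEntity = bytesPerEntity;
        }

        void onSharedEntityAdded() {
            // Overflow entity: charge the shared pool, not the reservation.
            memoryTracker.consumeMemory(bytesPerEntity, false, Origin.REAL_TIME_DETECTOR);
        }

        void onSharedEntityRemoved() {
            memoryTracker.releaseMemory(bytesPerEntity, false, Origin.REAL_TIME_DETECTOR);
        }

        void onBufferCleared(long reservedBytes, long sharedBytes) {
            // Release the up-front reservation plus anything borrowed from the pool.
            memoryTracker.releaseMemory(reservedBytes, true, Origin.REAL_TIME_DETECTOR);
            if (sharedBytes > 0) {
                memoryTracker.releaseMemory(sharedBytes, false, Origin.REAL_TIME_DETECTOR);
            }
        }
    }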
- memoryTracker.releaseMemory(getReservedBytes(), true, Origin.HC_DETECTOR); + memoryTracker.releaseMemory(getReservedBytes(), true, Origin.REAL_TIME_DETECTOR); if (!sharedCacheEmpty()) { - memoryTracker.releaseMemory(getBytesInSharedCache(), false, Origin.HC_DETECTOR); + memoryTracker.releaseMemory(getBytesInSharedCache(), false, Origin.REAL_TIME_DETECTOR); } items.clear(); priorityTracker.clearPriority(); @@ -517,7 +517,7 @@ public boolean expired(Duration stateTtl) { return expired(lastUsedTime, stateTtl, clock.instant()); } - public String getDetectorId() { + public String getId() { return detectorId; } diff --git a/src/main/java/org/opensearch/ad/caching/DoorKeeper.java b/src/main/java/org/opensearch/ad/caching/DoorKeeper.java index 96a18d8f6..5bb5e3cd5 100644 --- a/src/main/java/org/opensearch/ad/caching/DoorKeeper.java +++ b/src/main/java/org/opensearch/ad/caching/DoorKeeper.java @@ -17,8 +17,8 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.ExpiringState; -import org.opensearch.ad.MaintenanceState; +import org.opensearch.timeseries.ExpiringState; +import org.opensearch.timeseries.MaintenanceState; import com.google.common.base.Charsets; import com.google.common.hash.BloomFilter; diff --git a/src/main/java/org/opensearch/ad/caching/EntityCache.java b/src/main/java/org/opensearch/ad/caching/EntityCache.java index 3db3d19c8..287994efd 100644 --- a/src/main/java/org/opensearch/ad/caching/EntityCache.java +++ b/src/main/java/org/opensearch/ad/caching/EntityCache.java @@ -16,14 +16,14 @@ import java.util.Optional; import org.apache.commons.lang3.tuple.Pair; -import org.opensearch.ad.CleanState; import org.opensearch.ad.DetectorModelSize; -import org.opensearch.ad.MaintenanceState; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.ModelProfile; +import org.opensearch.timeseries.CleanState; +import org.opensearch.timeseries.MaintenanceState; +import org.opensearch.timeseries.model.Entity; public interface EntityCache extends MaintenanceState, CleanState, DetectorModelSize { /** diff --git a/src/main/java/org/opensearch/ad/caching/PriorityCache.java b/src/main/java/org/opensearch/ad/caching/PriorityCache.java index b64fc6d0e..40e28975d 100644 --- a/src/main/java/org/opensearch/ad/caching/PriorityCache.java +++ b/src/main/java/org/opensearch/ad/caching/PriorityCache.java @@ -11,8 +11,8 @@ package org.opensearch.ad.caching; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.DEDICATED_CACHE_SIZE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_DEDICATED_CACHE_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE; import java.time.Clock; import java.time.Duration; @@ -38,23 +38,15 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.message.ParameterizedMessage; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.MemoryTracker.Origin; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.constant.CommonErrorMessages; import org.opensearch.ad.ml.CheckpointDao; import 
org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager.ModelType; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.ModelProfile; import org.opensearch.ad.ratelimit.CheckpointMaintainWorker; import org.opensearch.ad.ratelimit.CheckpointWriteWorker; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.EnabledSetting; +import org.opensearch.ad.settings.ADEnabledSetting; import org.opensearch.ad.util.DateUtils; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; @@ -63,6 +55,14 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.Strings; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.MemoryTracker.Origin; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.google.common.cache.Cache; import com.google.common.cache.CacheBuilder; @@ -116,12 +116,12 @@ public PriorityCache( this.activeEnities = new ConcurrentHashMap<>(); this.dedicatedCacheSize = dedicatedCacheSize; - clusterService.getClusterSettings().addSettingsUpdateConsumer(DEDICATED_CACHE_SIZE, (it) -> { + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_DEDICATED_CACHE_SIZE, (it) -> { this.dedicatedCacheSize = it; this.setDedicatedCacheSizeListener(); this.tryClearUpMemory(); }, this::validateDedicatedCacheSize); - clusterService.getClusterSettings().addSettingsUpdateConsumer(MODEL_MAX_SIZE_PERCENTAGE, it -> this.tryClearUpMemory()); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MODEL_MAX_SIZE_PERCENTAGE, it -> this.tryClearUpMemory()); this.memoryTracker = memoryTracker; this.maintenanceLock = new ReentrantLock(); @@ -153,19 +153,19 @@ public PriorityCache( @Override public ModelState get(String modelId, AnomalyDetector detector) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); CacheBuffer buffer = computeBufferIfAbsent(detector, detectorId); ModelState modelState = buffer.get(modelId); // during maintenance period, stop putting new entries if (!maintenanceLock.isLocked() && modelState == null) { - if (EnabledSetting.isDoorKeeperInCacheEnabled()) { + if (ADEnabledSetting.isDoorKeeperInCacheEnabled()) { DoorKeeper doorKeeper = doorKeepers.computeIfAbsent(detectorId, id -> { // reset every 60 intervals return new DoorKeeper( - AnomalyDetectorSettings.DOOR_KEEPER_FOR_CACHE_MAX_INSERTION, - AnomalyDetectorSettings.DOOR_KEEPER_FAULSE_POSITIVE_RATE, - detector.getDetectionIntervalDuration().multipliedBy(AnomalyDetectorSettings.DOOR_KEEPER_MAINTENANCE_FREQ), + TimeSeriesSettings.DOOR_KEEPER_FOR_CACHE_MAX_INSERTION, + TimeSeriesSettings.DOOR_KEEPER_FALSE_POSITIVE_RATE, + detector.getIntervalDuration().multipliedBy(TimeSeriesSettings.DOOR_KEEPER_MAINTENANCE_FREQ), clock ); }); @@ -244,7 +244,7 @@ public boolean hostIfPossible(AnomalyDetector detector, ModelState return false; } String modelId = toUpdate.getModelId(); - String detectorId = toUpdate.getDetectorId(); + String detectorId = toUpdate.getId(); if 
(Strings.isEmpty(modelId) || Strings.isEmpty(detectorId)) { return false; @@ -454,8 +454,8 @@ private CacheBuffer computeBufferIfAbsent(AnomalyDetector detector, String detec if (buffer == null) { long requiredBytes = getRequiredMemory(detector, dedicatedCacheSize); if (memoryTracker.canAllocateReserved(requiredBytes)) { - memoryTracker.consumeMemory(requiredBytes, true, Origin.HC_DETECTOR); - long intervalSecs = detector.getDetectorIntervalInSeconds(); + memoryTracker.consumeMemory(requiredBytes, true, Origin.REAL_TIME_DETECTOR); + long intervalSecs = detector.getIntervalInSeconds(); buffer = new CacheBuffer( dedicatedCacheSize, @@ -475,7 +475,7 @@ private CacheBuffer computeBufferIfAbsent(AnomalyDetector detector, String detec // Put tryClearUpMemory after consumeMemory to prevent that. tryClearUpMemory(); } else { - throw new LimitExceededException(detectorId, CommonErrorMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG); + throw new LimitExceededException(detectorId, CommonMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG); } } @@ -494,7 +494,7 @@ private long getRequiredMemory(AnomalyDetector detector, int numberOfEntity) { .estimateTRCFModelSize( dimension, numberOfTrees, - AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO, + TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO, detector.getShingleSize().intValue(), true ); @@ -541,7 +541,7 @@ private Triple canReplaceInSharedCache(CacheBuffer o private void tryClearUpMemory() { try { if (maintenanceLock.tryLock()) { - threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME).execute(() -> clearMemory()); + threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME).execute(() -> clearMemory()); } else { threadPool.schedule(() -> { try { @@ -549,7 +549,7 @@ private void tryClearUpMemory() { } catch (Exception e) { LOG.error("Fail to clear up memory taken by CacheBuffer. Will retry during maintenance."); } - }, new TimeValue(random.nextInt(90), TimeUnit.SECONDS), AnomalyDetectorPlugin.AD_THREAD_POOL_NAME); + }, new TimeValue(random.nextInt(90), TimeUnit.SECONDS), TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME); } } finally { if (maintenanceLock.isHeldByCurrentThread()) { @@ -614,7 +614,7 @@ private void recalculateUsedMemory() { reserved += buffer.getReservedBytes(); shared += buffer.getBytesInSharedCache(); } - memoryTracker.syncMemoryState(Origin.HC_DETECTOR, reserved + shared, reserved); + memoryTracker.syncMemoryState(Origin.REAL_TIME_DETECTOR, reserved + shared, reserved); } /** @@ -658,7 +658,7 @@ public void maintenance() { }); } catch (Exception e) { // will be thrown to ES's transport broadcast handler - throw new AnomalyDetectionException("Fail to maintain cache", e); + throw new TimeSeriesException("Fail to maintain cache", e); } } diff --git a/src/main/java/org/opensearch/ad/caching/PriorityTracker.java b/src/main/java/org/opensearch/ad/caching/PriorityTracker.java index 05304912f..439d67679 100644 --- a/src/main/java/org/opensearch/ad/caching/PriorityTracker.java +++ b/src/main/java/org/opensearch/ad/caching/PriorityTracker.java @@ -27,7 +27,7 @@ import org.apache.commons.lang.builder.ToStringBuilder; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.annotation.Generated; +import org.opensearch.timeseries.annotation.Generated; /** * A priority tracker for entities. Read docs/entity-priority.pdf for details. 
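The PriorityCache hunks above gate cache admissions with a DoorKeeper, so an entity must show up more than once before its model is hosted. A minimal sketch of that pattern with Guava's BloomFilter, assuming illustrative capacity and false-positive values (the real ones are the TimeSeriesSettings constants in the hunk) and a simplified reset-on-read maintenance policy:

    import java.time.Clock;
    import java.time.Duration;
    import java.time.Instant;

    import com.google.common.base.Charsets;
    import com.google.common.hash.BloomFilter;
    import com.google.common.hash.Funnels;

    class DoorKeeperSketch {
        private BloomFilter<String> filter;
        private final long maxInsertions;     // e.g. DOOR_KEEPER_FOR_CACHE_MAX_INSERTION
        private final double fpp;             // e.g. DOOR_KEEPER_FALSE_POSITIVE_RATE
        private final Duration resetInterval; // detector interval x DOOR_KEEPER_MAINTENANCE_FREQ
        private final Clock clock;
        private Instant lastReset;

        DoorKeeperSketch(long maxInsertions, double fpp, Duration resetInterval, Clock clock) {
            this.maxInsertions = maxInsertions;
            this.fpp = fpp;
            this.resetInterval = resetInterval;
            this.clock = clock;
            reset();
        }

        // First sighting records the entity and returns false; a later sighting
        // returns true, letting the cache host (or restore) the model.
        boolean admit(String entityModelId) {
            maybeReset();
            if (filter.mightContain(entityModelId)) {
                return true;
            }
            filter.put(entityModelId);
            return false;
        }

        // Start over periodically so one-off entities do not accumulate forever.
        private void maybeReset() {
            if (Duration.between(lastReset, clock.instant()).compareTo(resetInterval) >= 0) {
                reset();
            }
        }

        private void reset() {
            filter = BloomFilter.create(Funnels.stringFunnel(Charsets.UTF_8), maxInsertions, fpp);
            lastReset = clock.instant();
        }
    }

Because a Bloom filter can return false positives but never false negatives, the worst case is admitting an occasional first-time entity early, which the priority-based eviction in CacheBuffer can tolerate.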
diff --git a/src/main/java/org/opensearch/ad/cluster/ADDataMigrator.java b/src/main/java/org/opensearch/ad/cluster/ADDataMigrator.java index 337104735..4050c22f5 100644 --- a/src/main/java/org/opensearch/ad/cluster/ADDataMigrator.java +++ b/src/main/java/org/opensearch/ad/cluster/ADDataMigrator.java @@ -11,17 +11,15 @@ package org.opensearch.ad.cluster; -import static org.opensearch.ad.constant.CommonName.DETECTION_STATE_INDEX; +import static org.opensearch.ad.constant.ADCommonName.DETECTION_STATE_INDEX; import static org.opensearch.ad.model.ADTask.DETECTOR_ID_FIELD; import static org.opensearch.ad.model.ADTask.IS_LATEST_FIELD; import static org.opensearch.ad.model.ADTask.TASK_TYPE_FIELD; -import static org.opensearch.ad.model.ADTaskType.taskTypeToString; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_DETECTOR_UPPER_LIMIT; -import static org.opensearch.ad.util.RestHandlerUtils.XCONTENT_WITH_TYPE; -import static org.opensearch.ad.util.RestHandlerUtils.createXContentParserFromRegistry; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.model.TaskType.taskTypeToString; +import static org.opensearch.timeseries.util.RestHandlerUtils.XCONTENT_WITH_TYPE; +import static org.opensearch.timeseries.util.RestHandlerUtils.createXContentParserFromRegistry; import java.io.IOException; import java.time.Instant; @@ -37,17 +35,12 @@ import org.opensearch.action.index.IndexRequest; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.ADTask; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.DetectorInternalState; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.xcontent.XContentFactory; @@ -61,6 +54,12 @@ import org.opensearch.index.query.TermsQueryBuilder; import org.opensearch.search.SearchHit; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.util.ExceptionUtil; /** * Migrate AD data to support backward compatibility. 
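The migrator below drains a queue of parsed realtime jobs and backfills a task per job. The per-job task creation is elided in this hunk, so here is a skeleton of just the drain loop, with a hypothetical createTask callback standing in for it:

    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.function.BiConsumer;

    class BackfillSketch {
        interface Job {}

        // Process one job, then recurse from the completion callback so at most
        // one backfill is in flight at a time. createTask is a stand-in for the
        // real per-job task creation and must invoke the callback on success and
        // on failure alike, so one bad job cannot stall the rest of the queue.
        static void backfill(ConcurrentLinkedQueue<Job> jobs, BiConsumer<Job, Runnable> createTask) {
            Job job = jobs.poll();
            if (job == null) {
                return; // queue drained; this batch of migration is done
            }
            createTask.accept(job, () -> backfill(jobs, createTask));
        }
    }

Recursion depth stays bounded only when createTask completes asynchronously, as ActionListener-based code does; with a synchronous callback and a large queue, an iterative loop would be the safer shape.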
@@ -72,14 +71,14 @@ public class ADDataMigrator { private final Client client; private final ClusterService clusterService; private final NamedXContentRegistry xContentRegistry; - private final AnomalyDetectionIndices detectionIndices; + private final ADIndexManagement detectionIndices; private final AtomicBoolean dataMigrated; public ADDataMigrator( Client client, ClusterService clusterService, NamedXContentRegistry xContentRegistry, - AnomalyDetectionIndices detectionIndices + ADIndexManagement detectionIndices ) { this.client = client; this.clusterService = clusterService; @@ -95,21 +94,21 @@ public void migrateData() { if (!dataMigrated.getAndSet(true)) { logger.info("Start migrating AD data"); - if (!detectionIndices.doesAnomalyDetectorJobIndexExist()) { + if (!detectionIndices.doesJobIndexExist()) { logger.info("AD job index doesn't exist, no need to migrate"); return; } - if (detectionIndices.doesDetectorStateIndexExist()) { + if (detectionIndices.doesStateIndexExist()) { migrateDetectorInternalStateToRealtimeTask(); } else { // If detection index doesn't exist, create index and backfill realtime task. - detectionIndices.initDetectionStateIndex(ActionListener.wrap(r -> { + detectionIndices.initStateIndex(ActionListener.wrap(r -> { if (r.isAcknowledged()) { - logger.info("Created {} with mappings.", CommonName.DETECTION_STATE_INDEX); + logger.info("Created {} with mappings.", ADCommonName.DETECTION_STATE_INDEX); migrateDetectorInternalStateToRealtimeTask(); } else { - String error = "Create index " + CommonName.DETECTION_STATE_INDEX + " with mappings not acknowledged"; + String error = "Create index " + ADCommonName.DETECTION_STATE_INDEX + " with mappings not acknowledged"; logger.warn(error); } }, e -> { @@ -132,19 +131,19 @@ public void migrateDetectorInternalStateToRealtimeTask() { SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder() .query(new MatchAllQueryBuilder()) .size(MAX_DETECTOR_UPPER_LIMIT); - SearchRequest searchRequest = new SearchRequest(ANOMALY_DETECTOR_JOB_INDEX).source(searchSourceBuilder); + SearchRequest searchRequest = new SearchRequest(CommonName.JOB_INDEX).source(searchSourceBuilder); client.search(searchRequest, ActionListener.wrap(r -> { if (r == null || r.getHits().getTotalHits() == null || r.getHits().getTotalHits().value == 0) { logger.info("No anomaly detector job found, no need to migrate"); return; } - ConcurrentLinkedQueue detectorJobs = new ConcurrentLinkedQueue<>(); + ConcurrentLinkedQueue detectorJobs = new ConcurrentLinkedQueue<>(); Iterator iterator = r.getHits().iterator(); while (iterator.hasNext()) { SearchHit searchHit = iterator.next(); try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, searchHit.getSourceRef())) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); + Job job = Job.parse(parser); detectorJobs.add(job); } catch (IOException e) { logger.error("Fail to parse AD job " + searchHit.getId(), e); @@ -169,8 +168,8 @@ public void migrateDetectorInternalStateToRealtimeTask() { * @param detectorJobs realtime AD jobs * @param backfillAllJob backfill task for all realtime job or not */ - public void backfillRealtimeTask(ConcurrentLinkedQueue detectorJobs, boolean backfillAllJob) { - AnomalyDetectorJob job = detectorJobs.poll(); + public void backfillRealtimeTask(ConcurrentLinkedQueue detectorJobs, boolean backfillAllJob) { + Job job = detectorJobs.poll(); if (job == null) { logger.info("AD data migration 
done."); if (backfillAllJob) { @@ -180,7 +179,7 @@ public void backfillRealtimeTask(ConcurrentLinkedQueue detec } String jobId = job.getName(); - AnomalyDetectorFunction createRealtimeTaskFunction = () -> { + ExecutorFunction createRealtimeTaskFunction = () -> { GetRequest getRequest = new GetRequest(DETECTION_STATE_INDEX, jobId); client.get(getRequest, ActionListener.wrap(r -> { if (r != null && r.isExists()) { @@ -204,9 +203,9 @@ public void backfillRealtimeTask(ConcurrentLinkedQueue detec } private void checkIfRealtimeTaskExistsAndBackfill( - AnomalyDetectorJob job, - AnomalyDetectorFunction createRealtimeTaskFunction, - ConcurrentLinkedQueue detectorJobs, + Job job, + ExecutorFunction createRealtimeTaskFunction, + ConcurrentLinkedQueue detectorJobs, boolean migrateAll ) { String jobId = job.getName(); @@ -234,24 +233,19 @@ private void checkIfRealtimeTaskExistsAndBackfill( })); } - private void createRealtimeADTask( - AnomalyDetectorJob job, - String error, - ConcurrentLinkedQueue detectorJobs, - boolean migrateAll - ) { - client.get(new GetRequest(ANOMALY_DETECTORS_INDEX, job.getName()), ActionListener.wrap(r -> { + private void createRealtimeADTask(Job job, String error, ConcurrentLinkedQueue detectorJobs, boolean migrateAll) { + client.get(new GetRequest(CommonName.CONFIG_INDEX, job.getName()), ActionListener.wrap(r -> { if (r != null && r.isExists()) { try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, r.getSourceAsBytesRef())) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); AnomalyDetector detector = AnomalyDetector.parse(parser, r.getId()); - ADTaskType taskType = detector.isMultientityDetector() + ADTaskType taskType = detector.isHighCardinality() ? ADTaskType.REALTIME_HC_DETECTOR : ADTaskType.REALTIME_SINGLE_ENTITY; Instant now = Instant.now(); String userName = job.getUser() != null ? 
job.getUser().getName() : null; ADTask adTask = new ADTask.Builder() - .detectorId(detector.getDetectorId()) + .configId(detector.getId()) .detector(detector) .error(error) .isLatest(true) @@ -259,7 +253,7 @@ private void createRealtimeADTask( .executionStartTime(now) .taskProgress(0.0f) .initProgress(0.0f) - .state(ADTaskState.CREATED.name()) + .state(TaskState.CREATED.name()) .lastUpdateTime(now) .startedBy(userName) .coordinatingNode(null) diff --git a/src/main/java/org/opensearch/ad/cluster/ADVersionUtil.java b/src/main/java/org/opensearch/ad/cluster/ADVersionUtil.java index 35c1f7ec0..7e880de66 100644 --- a/src/main/java/org/opensearch/ad/cluster/ADVersionUtil.java +++ b/src/main/java/org/opensearch/ad/cluster/ADVersionUtil.java @@ -11,16 +11,15 @@ package org.opensearch.ad.cluster; -import static org.opensearch.ad.constant.CommonName.AD_PLUGIN_VERSION_FOR_TEST; - import org.opensearch.Version; +import org.opensearch.timeseries.constant.CommonName; public class ADVersionUtil { public static final int VERSION_SEGMENTS = 3; public static Version fromString(String adVersion) { - if (AD_PLUGIN_VERSION_FOR_TEST.equals(adVersion)) { + if (CommonName.TIME_SERIES_PLUGIN_VERSION_FOR_TEST.equals(adVersion)) { return Version.CURRENT; } return Version.fromString(normalizeVersion(adVersion)); @@ -42,8 +41,4 @@ public static String normalizeVersion(String adVersion) { } return normalizedVersion.toString(); } - - public static boolean compatibleWithVersionOnOrAfter1_1(Version adVersion) { - return adVersion != null && adVersion.onOrAfter(Version.V_1_1_0); - } } diff --git a/src/main/java/org/opensearch/ad/cluster/ClusterManagerEventListener.java b/src/main/java/org/opensearch/ad/cluster/ClusterManagerEventListener.java index 250652f47..8b8a40405 100644 --- a/src/main/java/org/opensearch/ad/cluster/ClusterManagerEventListener.java +++ b/src/main/java/org/opensearch/ad/cluster/ClusterManagerEventListener.java @@ -16,9 +16,7 @@ import org.opensearch.ad.cluster.diskcleanup.IndexCleanup; import org.opensearch.ad.cluster.diskcleanup.ModelCheckpointIndexRetention; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.ad.util.DateUtils; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.cluster.LocalNodeClusterManagerListener; import org.opensearch.cluster.service.ClusterService; @@ -28,6 +26,8 @@ import org.opensearch.common.unit.TimeValue; import org.opensearch.threadpool.Scheduler.Cancellable; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import com.google.common.annotations.VisibleForTesting; diff --git a/src/main/java/org/opensearch/ad/cluster/DailyCron.java b/src/main/java/org/opensearch/ad/cluster/DailyCron.java index a7f380dbb..2692608d2 100644 --- a/src/main/java/org/opensearch/ad/cluster/DailyCron.java +++ b/src/main/java/org/opensearch/ad/cluster/DailyCron.java @@ -17,14 +17,14 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.support.IndicesOptions; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.ml.CheckpointDao; -import org.opensearch.ad.util.ClientUtil; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.action.ActionListener; import org.opensearch.index.IndexNotFoundException; import org.opensearch.index.query.QueryBuilders; import org.opensearch.index.reindex.DeleteByQueryAction; import 
org.opensearch.index.reindex.DeleteByQueryRequest; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.util.ClientUtil; @Deprecated public class DailyCron implements Runnable { @@ -46,15 +46,15 @@ public DailyCron(Clock clock, Duration checkpointTtl, ClientUtil clientUtil) { @Override public void run() { - DeleteByQueryRequest deleteRequest = new DeleteByQueryRequest(CommonName.CHECKPOINT_INDEX_NAME) + DeleteByQueryRequest deleteRequest = new DeleteByQueryRequest(ADCommonName.CHECKPOINT_INDEX_NAME) .setQuery( QueryBuilders .boolQuery() .filter( QueryBuilders - .rangeQuery(CheckpointDao.TIMESTAMP) + .rangeQuery(CommonName.TIMESTAMP) .lte(clock.millis() - checkpointTtl.toMillis()) - .format(CommonName.EPOCH_MILLIS_FORMAT) + .format(ADCommonName.EPOCH_MILLIS_FORMAT) ) ) .setIndicesOptions(IndicesOptions.LENIENT_EXPAND_OPEN); diff --git a/src/main/java/org/opensearch/ad/cluster/HashRing.java b/src/main/java/org/opensearch/ad/cluster/HashRing.java index 8d1813ea6..30ea1724f 100644 --- a/src/main/java/org/opensearch/ad/cluster/HashRing.java +++ b/src/main/java/org/opensearch/ad/cluster/HashRing.java @@ -11,9 +11,7 @@ package org.opensearch.ad.cluster; -import static org.opensearch.ad.constant.CommonName.AD_PLUGIN_NAME; -import static org.opensearch.ad.constant.CommonName.AD_PLUGIN_NAME_FOR_TEST; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.COOLDOWN_MINUTES; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_COOLDOWN_MINUTES; import java.time.Clock; import java.util.ArrayList; @@ -37,10 +35,7 @@ import org.opensearch.action.admin.cluster.node.info.NodeInfo; import org.opensearch.action.admin.cluster.node.info.NodesInfoRequest; import org.opensearch.action.admin.cluster.node.info.PluginsAndModules; -import org.opensearch.ad.common.exception.AnomalyDetectionException; import org.opensearch.ad.ml.ModelManager; -import org.opensearch.ad.ml.SingleStreamModelIdMapper; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.AdminClient; import org.opensearch.client.Client; import org.opensearch.client.ClusterAdminClient; @@ -54,6 +49,10 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.transport.TransportAddress; import org.opensearch.plugins.PluginInfo; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.ml.SingleStreamModelIdMapper; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import com.google.common.collect.Sets; @@ -110,8 +109,8 @@ public HashRing( this.nodeFilter = nodeFilter; this.buildHashRingSemaphore = new Semaphore(1); this.clock = clock; - this.coolDownPeriodForRealtimeAD = COOLDOWN_MINUTES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(COOLDOWN_MINUTES, it -> coolDownPeriodForRealtimeAD = it); + this.coolDownPeriodForRealtimeAD = AD_COOLDOWN_MINUTES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_COOLDOWN_MINUTES, it -> coolDownPeriodForRealtimeAD = it); this.lastUpdateForRealtimeAD = 0; this.client = client; @@ -216,7 +215,7 @@ public void buildCirclesForRealtimeAD() { */ private void buildCircles(Set removedNodeIds, Set addedNodeIds, ActionListener actionListener) { if (buildHashRingSemaphore.availablePermits() != 0) { - throw new AnomalyDetectionException("Must get update hash ring semaphore before building AD hash ring"); + throw new TimeSeriesException("Must 
get update hash ring semaphore before building AD hash ring"); } try { DiscoveryNode localNode = clusterService.localNode(); @@ -265,7 +264,8 @@ private void buildCircles(Set removedNodeIds, Set addedNodeIds, } TreeMap circle = null; for (PluginInfo pluginInfo : plugins.getPluginInfos()) { - if (AD_PLUGIN_NAME.equals(pluginInfo.getName()) || AD_PLUGIN_NAME_FOR_TEST.equals(pluginInfo.getName())) { + if (CommonName.TIME_SERIES_PLUGIN_NAME.equals(pluginInfo.getName()) + || CommonName.TIME_SERIES_PLUGIN_NAME_FOR_TEST.equals(pluginInfo.getName())) { Version version = ADVersionUtil.fromString(pluginInfo.getVersion()); boolean eligibleNode = nodeFilter.isEligibleNode(curNode); if (eligibleNode) { @@ -288,7 +288,7 @@ private void buildCircles(Set removedNodeIds, Set addedNodeIds, // rebuild AD version hash ring with cooldown after all new node added. rebuildCirclesForRealtimeAD(); - if (!dataMigrator.isMigrated() && circles.size() > 0 && circles.lastEntry().getKey().onOrAfter(Version.V_1_1_0)) { + if (!dataMigrator.isMigrated() && circles.size() > 0) { // Find owning node with highest AD version to make sure the data migration logic be compatible to // latest AD version when upgrade. Optional owningNode = getOwningNodeWithHighestAdVersion(DEFAULT_HASH_RING_MODEL_ID); @@ -383,7 +383,7 @@ private void rebuildCirclesForRealtimeAD() { * 1. There is node change event not consumed, and * 2. Have passed cool down period from last hash ring update time. * - * Check {@link org.opensearch.ad.settings.AnomalyDetectorSettings#COOLDOWN_MINUTES} about + * Check {@link org.opensearch.ad.settings.AnomalyDetectorSettings#AD_COOLDOWN_MINUTES} about * cool down settings. * * Why we need to wait for some cooldown period before rebuilding hash ring? diff --git a/src/main/java/org/opensearch/ad/cluster/HourlyCron.java b/src/main/java/org/opensearch/ad/cluster/HourlyCron.java index ec9a5acd8..687aace69 100644 --- a/src/main/java/org/opensearch/ad/cluster/HourlyCron.java +++ b/src/main/java/org/opensearch/ad/cluster/HourlyCron.java @@ -16,10 +16,10 @@ import org.opensearch.action.FailedNodeException; import org.opensearch.ad.transport.CronAction; import org.opensearch.ad.transport.CronRequest; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.core.action.ActionListener; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; public class HourlyCron implements Runnable { private static final Logger LOG = LogManager.getLogger(HourlyCron.class); diff --git a/src/main/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanup.java b/src/main/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanup.java index 02282c4ab..bd37127cb 100644 --- a/src/main/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanup.java +++ b/src/main/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanup.java @@ -21,7 +21,6 @@ import org.opensearch.action.admin.indices.stats.IndicesStatsResponse; import org.opensearch.action.admin.indices.stats.ShardStats; import org.opensearch.action.support.IndicesOptions; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.util.concurrent.ThreadContext; @@ -30,6 +29,7 @@ import org.opensearch.index.reindex.DeleteByQueryAction; import org.opensearch.index.reindex.DeleteByQueryRequest; import org.opensearch.index.store.StoreStats; +import org.opensearch.timeseries.util.ClientUtil; /** * Clean 
up the old docs for indices. diff --git a/src/main/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetention.java b/src/main/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetention.java index 0f56e0805..28fc05e37 100644 --- a/src/main/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetention.java +++ b/src/main/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetention.java @@ -16,11 +16,11 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.ml.CheckpointDao; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.action.ActionListener; import org.opensearch.index.IndexNotFoundException; import org.opensearch.index.query.QueryBuilders; +import org.opensearch.timeseries.constant.CommonName; /** * Model checkpoints cleanup of multi-entity detectors. @@ -57,14 +57,14 @@ public ModelCheckpointIndexRetention(Duration defaultCheckpointTtl, Clock clock, public void run() { indexCleanup .deleteDocsByQuery( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, QueryBuilders .boolQuery() .filter( QueryBuilders - .rangeQuery(CheckpointDao.TIMESTAMP) + .rangeQuery(CommonName.TIMESTAMP) .lte(clock.millis() - defaultCheckpointTtl.toMillis()) - .format(CommonName.EPOCH_MILLIS_FORMAT) + .format(ADCommonName.EPOCH_MILLIS_FORMAT) ), ActionListener.wrap(response -> { cleanupBasedOnShardSize(defaultCheckpointTtl.minusDays(1)); @@ -79,15 +79,15 @@ public void run() { private void cleanupBasedOnShardSize(Duration cleanUpTtl) { indexCleanup .deleteDocsBasedOnShardSize( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, MAX_SHARD_SIZE_IN_BYTE, QueryBuilders .boolQuery() .filter( QueryBuilders - .rangeQuery(CheckpointDao.TIMESTAMP) + .rangeQuery(CommonName.TIMESTAMP) .lte(clock.millis() - cleanUpTtl.toMillis()) - .format(CommonName.EPOCH_MILLIS_FORMAT) + .format(ADCommonName.EPOCH_MILLIS_FORMAT) ), ActionListener.wrap(cleanupNeeded -> { if (cleanupNeeded) { diff --git a/src/main/java/org/opensearch/ad/common/exception/ClientException.java b/src/main/java/org/opensearch/ad/common/exception/ClientException.java deleted file mode 100644 index bce5dc288..000000000 --- a/src/main/java/org/opensearch/ad/common/exception/ClientException.java +++ /dev/null @@ -1,34 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.common.exception; - -/** - * All exception visible to AD transport layer's client is under ClientException. 
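(The exception classes deleted in this region are not dropped outright: judging by imports added elsewhere in this patch, such as org.opensearch.timeseries.common.exception.TimeSeriesException and org.opensearch.timeseries.common.exception.ResourceNotFoundException, the client-visible hierarchy presumably moves to org.opensearch.timeseries.common.exception, with TimeSeriesException replacing AnomalyDetectionException as the root. The assumed shape, inferred rather than shown in this section:

    // TimeSeriesException                              root, replaces AnomalyDetectionException
    //   ClientException extends TimeSeriesException    errors surfaced to transport-layer clients
    //     InternalFailure extends ClientException      transient failure, root cause unknown

The exact target package for ClientException and InternalFailure does not appear in this part of the diff.)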
- */ -public class ClientException extends AnomalyDetectionException { - - public ClientException(String message) { - super(message); - } - - public ClientException(String anomalyDetectorId, String message) { - super(anomalyDetectorId, message); - } - - public ClientException(String anomalyDetectorId, String message, Throwable throwable) { - super(anomalyDetectorId, message, throwable); - } - - public ClientException(String anomalyDetectorId, Throwable cause) { - super(anomalyDetectorId, cause); - } -} diff --git a/src/main/java/org/opensearch/ad/common/exception/InternalFailure.java b/src/main/java/org/opensearch/ad/common/exception/InternalFailure.java deleted file mode 100644 index dc192f65a..000000000 --- a/src/main/java/org/opensearch/ad/common/exception/InternalFailure.java +++ /dev/null @@ -1,35 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.common.exception; - -/** - * Exception for root cause unknown failure. Maybe transient. Client can continue the detector running. - * - */ -public class InternalFailure extends ClientException { - - public InternalFailure(String anomalyDetectorId, String message) { - super(anomalyDetectorId, message); - } - - public InternalFailure(String anomalyDetectorId, String message, Throwable cause) { - super(anomalyDetectorId, message, cause); - } - - public InternalFailure(String anomalyDetectorId, Throwable cause) { - super(anomalyDetectorId, cause); - } - - public InternalFailure(AnomalyDetectionException cause) { - super(cause.getAnomalyDetectorId(), cause); - } -} diff --git a/src/main/java/org/opensearch/ad/constant/ADCommonMessages.java b/src/main/java/org/opensearch/ad/constant/ADCommonMessages.java new file mode 100644 index 000000000..1f186f647 --- /dev/null +++ b/src/main/java/org/opensearch/ad/constant/ADCommonMessages.java @@ -0,0 +1,55 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.ad.constant; + +import static org.opensearch.ad.constant.ADCommonName.CUSTOM_RESULT_INDEX_PREFIX; + +public class ADCommonMessages { + public static final String AD_ID_MISSING_MSG = "AD ID is missing"; + public static final String MODEL_ID_MISSING_MSG = "Model ID is missing"; + public static final String HASH_ERR_MSG = "Cannot find an RCF node. Hashing does not work."; + public static final String NO_CHECKPOINT_ERR_MSG = "No checkpoints found for model id "; + public static final String FEATURE_NOT_AVAILABLE_ERR_MSG = "No Feature in current detection window."; + public static final String DISABLED_ERR_MSG = + "AD functionality is disabled. 
To enable update plugins.anomaly_detection.enabled to true"; + public static String CATEGORICAL_FIELD_NUMBER_SURPASSED = "We don't support categorical fields more than "; + public static String DETECTOR_IS_RUNNING = "Detector is already running"; + public static String DETECTOR_MISSING = "Detector is missing"; + public static String AD_TASK_ACTION_MISSING = "AD task action is missing"; + public static final String INDEX_NOT_FOUND = "index does not exist"; + public static final String NOT_EXISTENT_VALIDATION_TYPE = "The given validation type doesn't exist"; + public static final String UNSUPPORTED_PROFILE_TYPE = "Unsupported profile types"; + + public static final String REQUEST_THROTTLED_MSG = "Request throttled. Please try again later."; + public static String NULL_DETECTION_INTERVAL = "Detection interval should be set"; + public static String INVALID_SHINGLE_SIZE = "Shingle size must be a positive integer"; + public static String INVALID_DETECTION_INTERVAL = "Detection interval must be a positive integer"; + public static String EXCEED_HISTORICAL_ANALYSIS_LIMIT = "Exceed max historical analysis limit per node"; + public static String NO_ELIGIBLE_NODE_TO_RUN_DETECTOR = "No eligible node to run detector "; + public static String EMPTY_STALE_RUNNING_ENTITIES = "Empty stale running entities"; + public static String NO_ENTITY_FOUND = "No entity found"; + public static String HISTORICAL_ANALYSIS_CANCELLED = "Historical analysis cancelled by user"; + public static String HC_DETECTOR_TASK_IS_UPDATING = "HC detector task is updating"; + public static String INVALID_TIME_CONFIGURATION_UNITS = "Time unit %s is not supported"; + public static String FAIL_TO_GET_DETECTOR = "Fail to get detector"; + public static String FAIL_TO_CREATE_DETECTOR = "Fail to create detector"; + public static String FAIL_TO_UPDATE_DETECTOR = "Fail to update detector"; + public static String FAIL_TO_PREVIEW_DETECTOR = "Fail to preview detector"; + public static String FAIL_TO_START_DETECTOR = "Fail to start detector"; + public static String FAIL_TO_STOP_DETECTOR = "Fail to stop detector"; + public static String FAIL_TO_DELETE_DETECTOR = "Fail to delete detector"; + public static String FAIL_TO_DELETE_AD_RESULT = "Fail to delete anomaly result"; + public static final String NO_MODEL_ERR_MSG = "No RCF models are available either because RCF" + + " models are not ready or all nodes are unresponsive or the system might have bugs."; + public static String INVALID_RESULT_INDEX_PREFIX = "Result index must start with " + CUSTOM_RESULT_INDEX_PREFIX; + +} diff --git a/src/main/java/org/opensearch/ad/constant/CommonName.java b/src/main/java/org/opensearch/ad/constant/ADCommonName.java similarity index 69% rename from src/main/java/org/opensearch/ad/constant/CommonName.java rename to src/main/java/org/opensearch/ad/constant/ADCommonName.java index dd270ae42..9f7f39759 100644 --- a/src/main/java/org/opensearch/ad/constant/CommonName.java +++ b/src/main/java/org/opensearch/ad/constant/ADCommonName.java @@ -11,9 +11,9 @@ package org.opensearch.ad.constant; -import org.opensearch.ad.stats.StatNames; +import org.opensearch.timeseries.stats.StatNames; -public class CommonName { +public class ADCommonName { // ====================================== // Index name // ====================================== @@ -21,7 +21,6 @@ public class CommonName { public static final String CHECKPOINT_INDEX_NAME = ".opendistro-anomaly-checkpoints"; // index name for anomaly detection state. Will store AD task in this index as well. 
public static final String DETECTION_STATE_INDEX = ".opendistro-anomaly-detection-state"; - // TODO: move other index name here // The alias of the index in which to write AD result history public static final String ANOMALY_RESULT_INDEX_ALIAS = ".opendistro-anomaly-results"; @@ -37,9 +36,6 @@ public class CommonName { // Anomaly Detector name for X-Opaque-Id header // ====================================== public static final String ANOMALY_DETECTOR = "[Anomaly Detector]"; - public static final String AD_PLUGIN_NAME = "opensearch-anomaly-detection"; - public static final String AD_PLUGIN_NAME_FOR_TEST = "org.opensearch.ad.AnomalyDetectorPlugin"; - public static final String AD_PLUGIN_VERSION_FOR_TEST = "NA"; // ====================================== // Ultrawarm node attributes @@ -65,9 +61,7 @@ public class CommonName { public static final String MODELS = "models"; public static final String MODEL = "model"; public static final String INIT_PROGRESS = "init_progress"; - public static final String MODEL_SIZE_IN_BYTES = "model_size_in_bytes"; public static final String CATEGORICAL_FIELD = "category_field"; - public static final String TOTAL_ENTITIES = "total_entities"; public static final String ACTIVE_ENTITIES = "active_entities"; public static final String ENTITY_INFO = "entity_info"; @@ -82,38 +76,9 @@ public class CommonName { public static final String CANCEL_TASK = "cancel_task"; // ====================================== - // Index mapping - // ====================================== - // Elastic mapping type - public static final String MAPPING_TYPE = "_doc"; - - // Used to fetch mapping - public static final String TYPE = "type"; - public static final String KEYWORD_TYPE = "keyword"; - public static final String IP_TYPE = "ip"; - public static final String DATE_TYPE = "date"; - - // used for updating mapping - public static final String SCHEMA_VERSION_FIELD = "schema_version"; - - // ====================================== - // Query + // Used in stats API // ====================================== - // Used in finding the max timestamp - public static final String AGG_NAME_MAX_TIME = "max_timefield"; - // Used in finding the min timestamp - public static final String AGG_NAME_MIN_TIME = "min_timefield"; - // date histogram aggregation name - public static final String DATE_HISTOGRAM = "date_histogram"; - // feature aggregation name - public static final String FEATURE_AGGS = "feature_aggs"; - - // ====================================== - // Used in almost all components - // ====================================== - public static final String MODEL_ID_KEY = "model_id"; public static final String DETECTOR_ID_KEY = "detector_id"; - public static final String ENTITY_KEY = "entity"; // ====================================== // Used in toXContent @@ -124,11 +89,6 @@ public class CommonName { public static final String CONFIDENCE_JSON_KEY = "confidence"; public static final String ANOMALY_GRADE_JSON_KEY = "anomalyGrade"; public static final String QUEUE_JSON_KEY = "queue"; - public static final String START_JSON_KEY = "start"; - public static final String END_JSON_KEY = "end"; - public static final String VALUE_JSON_KEY = "value"; - public static final String ENTITIES_JSON_KEY = "entities"; - // ====================================== // Used for backward-compatibility in messaging // ====================================== @@ -138,13 +98,10 @@ public class CommonName { // ====================================== // detector validation aspect public static final String DETECTOR_ASPECT = 
"detector"; - public static final String MODEL_ASPECT = "model"; - // ====================================== // Used for custom AD result index // ====================================== public static final String DUMMY_AD_RESULT_ID = "dummy_ad_result_id"; public static final String DUMMY_DETECTOR_ID = "dummy_detector_id"; public static final String CUSTOM_RESULT_INDEX_PREFIX = "opensearch-ad-plugin-result-"; - public static final String PROPERTIES = "properties"; } diff --git a/src/main/java/org/opensearch/ad/constant/CommonErrorMessages.java b/src/main/java/org/opensearch/ad/constant/CommonErrorMessages.java deleted file mode 100644 index 5870a1278..000000000 --- a/src/main/java/org/opensearch/ad/constant/CommonErrorMessages.java +++ /dev/null @@ -1,137 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.constant; - -import static org.opensearch.ad.constant.CommonName.CUSTOM_RESULT_INDEX_PREFIX; -import static org.opensearch.ad.model.AnomalyDetector.MAX_RESULT_INDEX_NAME_SIZE; -import static org.opensearch.ad.rest.handler.AbstractAnomalyDetectorActionHandler.MAX_DETECTOR_NAME_SIZE; - -import java.util.Locale; - -public class CommonErrorMessages { - public static final String AD_ID_MISSING_MSG = "AD ID is missing"; - public static final String MODEL_ID_MISSING_MSG = "Model ID is missing"; - public static final String WAIT_ERR_MSG = "Exception in waiting for result"; - public static final String HASH_ERR_MSG = "Cannot find an RCF node. Hashing does not work."; - public static final String NO_CHECKPOINT_ERR_MSG = "No checkpoints found for model id "; - public static final String MEMORY_LIMIT_EXCEEDED_ERR_MSG = "AD models memory usage exceeds our limit."; - public static final String FEATURE_NOT_AVAILABLE_ERR_MSG = "No Feature in current detection window."; - public static final String MEMORY_CIRCUIT_BROKEN_ERR_MSG = "AD memory circuit is broken."; - public static final String DISABLED_ERR_MSG = "AD plugin is disabled. 
To enable update plugins.anomaly_detection.enabled to true"; - public static final String CAN_NOT_CHANGE_CATEGORY_FIELD = "Can't change detector category field"; - public static final String CAN_NOT_CHANGE_RESULT_INDEX = "Can't change detector result index"; - public static final String CREATE_INDEX_NOT_ACKNOWLEDGED = "Create index %S not acknowledged"; - // We need this invalid query tag to show proper error message on frontend - // refer to AD Dashboard code: https://tinyurl.com/8b5n8hat - public static final String INVALID_SEARCH_QUERY_MSG = "Invalid search query."; - public static final String ALL_FEATURES_DISABLED_ERR_MSG = - "Having trouble querying data because all of your features have been disabled."; - public static final String INVALID_TIMESTAMP_ERR_MSG = "timestamp is invalid"; - public static String FAIL_TO_PARSE_DETECTOR_MSG = "Fail to parse detector with id: "; - // change this error message to make it compatible with old version's integration(nexus) test - public static String FAIL_TO_FIND_DETECTOR_MSG = "Can't find detector with id: "; - public static String FAIL_TO_GET_PROFILE_MSG = "Fail to get profile for detector "; - public static String FAIL_TO_GET_TOTAL_ENTITIES = "Failed to get total entities for detector "; - public static String FAIL_TO_GET_USER_INFO = "Unable to get user information from detector "; - public static String NO_PERMISSION_TO_ACCESS_DETECTOR = "User does not have permissions to access detector: "; - public static String CATEGORICAL_FIELD_NUMBER_SURPASSED = "We don't support categorical fields more than "; - public static String EMPTY_PROFILES_COLLECT = "profiles to collect are missing or invalid"; - public static String FAIL_FETCH_ERR_MSG = "Fail to fetch profile for "; - public static String DETECTOR_IS_RUNNING = "Detector is already running"; - public static String DETECTOR_MISSING = "Detector is missing"; - public static String AD_TASK_ACTION_MISSING = "AD task action is missing"; - public static final String BUG_RESPONSE = "We might have bugs."; - public static final String INDEX_NOT_FOUND = "index does not exist"; - public static final String NOT_EXISTENT_VALIDATION_TYPE = "The given validation type doesn't exist"; - public static final String UNSUPPORTED_PROFILE_TYPE = "Unsupported profile types"; - - private static final String TOO_MANY_CATEGORICAL_FIELD_ERR_MSG_FORMAT = "We can have only %d categorical field/s."; - - public static String getTooManyCategoricalFieldErr(int limit) { - return String.format(Locale.ROOT, TOO_MANY_CATEGORICAL_FIELD_ERR_MSG_FORMAT, limit); - } - - public static final String REQUEST_THROTTLED_MSG = "Request throttled. 
Please try again later."; - public static String EMPTY_DETECTOR_NAME = "Detector name should be set"; - public static String NULL_TIME_FIELD = "Time field should be set"; - public static String EMPTY_INDICES = "Indices should be set"; - public static String NULL_DETECTION_INTERVAL = "Detection interval should be set"; - public static String INVALID_SHINGLE_SIZE = "Shingle size must be a positive integer"; - public static String INVALID_DETECTION_INTERVAL = "Detection interval must be a positive integer"; - public static String EXCEED_HISTORICAL_ANALYSIS_LIMIT = "Exceed max historical analysis limit per node"; - public static String NO_ELIGIBLE_NODE_TO_RUN_DETECTOR = "No eligible node to run detector "; - public static String EMPTY_STALE_RUNNING_ENTITIES = "Empty stale running entities"; - public static String CAN_NOT_FIND_LATEST_TASK = "can't find latest task"; - public static String NO_ENTITY_FOUND = "No entity found"; - public static String HISTORICAL_ANALYSIS_CANCELLED = "Historical analysis cancelled by user"; - public static String HC_DETECTOR_TASK_IS_UPDATING = "HC detector task is updating"; - public static String NEGATIVE_TIME_CONFIGURATION = "should be non-negative"; - public static String INVALID_TIME_CONFIGURATION_UNITS = "Time unit %s is not supported"; - public static String INVALID_DETECTOR_NAME = - "Valid characters for detector name are a-z, A-Z, 0-9, -(hyphen), _(underscore) and .(period)"; - public static String DUPLICATE_FEATURE_AGGREGATION_NAMES = "Detector has duplicate feature aggregation query names: "; - public static String INVALID_TIMESTAMP = "Timestamp field: (%s) must be of type date"; - public static String NON_EXISTENT_TIMESTAMP = "Timestamp field: (%s) is not found in index mapping"; - - public static String FAIL_TO_GET_DETECTOR = "Fail to get detector"; - public static String FAIL_TO_GET_DETECTOR_INFO = "Fail to get detector info"; - public static String FAIL_TO_CREATE_DETECTOR = "Fail to create detector"; - public static String FAIL_TO_UPDATE_DETECTOR = "Fail to update detector"; - public static String FAIL_TO_PREVIEW_DETECTOR = "Fail to preview detector"; - public static String FAIL_TO_START_DETECTOR = "Fail to start detector"; - public static String FAIL_TO_STOP_DETECTOR = "Fail to stop detector"; - public static String FAIL_TO_DELETE_DETECTOR = "Fail to delete detector"; - public static String FAIL_TO_DELETE_AD_RESULT = "Fail to delete anomaly result"; - public static String FAIL_TO_GET_STATS = "Fail to get stats"; - public static String FAIL_TO_SEARCH = "Fail to search"; - - public static String CAN_NOT_FIND_RESULT_INDEX = "Can't find result index "; - public static String INVALID_RESULT_INDEX_PREFIX = "Result index must start with " + CUSTOM_RESULT_INDEX_PREFIX; - public static String INVALID_RESULT_INDEX_NAME_SIZE = "Result index name size must contains less than " - + MAX_RESULT_INDEX_NAME_SIZE - + " characters"; - public static String INVALID_CHAR_IN_RESULT_INDEX_NAME = - "Result index name has invalid character. Valid characters are a-z, 0-9, -(hyphen) and _(underscore)"; - public static String INVALID_RESULT_INDEX_MAPPING = "Result index mapping is not correct for index: "; - public static String INVALID_DETECTOR_NAME_SIZE = "Name should be shortened. 
The maximum limit is " - + MAX_DETECTOR_NAME_SIZE - + " characters."; - - public static String WINDOW_DELAY_REC = - "Latest seen data point is at least %d minutes ago, consider changing window delay to at least %d minutes."; - public static String TIME_FIELD_NOT_ENOUGH_HISTORICAL_DATA = - "There isn't enough historical data found with current timefield selected."; - public static String DETECTOR_INTERVAL_REC = - "The selected detector interval might collect sparse data. Consider changing interval length to: "; - public static String RAW_DATA_TOO_SPARSE = - "Source index data is potentially too sparse for model training. Consider changing interval length or ingesting more data"; - public static String MODEL_VALIDATION_FAILED_UNEXPECTEDLY = "Model validation experienced issues completing."; - public static String FILTER_QUERY_TOO_SPARSE = "Data is too sparse after data filter is applied. Consider changing the data filter"; - public static String CATEGORY_FIELD_TOO_SPARSE = - "Data is most likely too sparse with the given category fields. Consider revising category field/s or ingesting more data "; - public static String CATEGORY_FIELD_NO_DATA = - "No entity was found with the given categorical fields. Consider revising category field/s or ingesting more data"; - public static String FEATURE_QUERY_TOO_SPARSE = - "Data is most likely too sparse when given feature queries are applied. Consider revising feature queries."; - public static String TIMEOUT_ON_INTERVAL_REC = "Timed out getting interval recommendation"; - - // Modifying message for FEATURE below may break the parseADValidationException method of ValidateAnomalyDetectorTransportAction - public static final String FEATURE_INVALID_MSG_PREFIX = "Feature has an invalid query"; - public static final String FEATURE_WITH_EMPTY_DATA_MSG = FEATURE_INVALID_MSG_PREFIX + " returning empty aggregated data: "; - public static final String FEATURE_WITH_INVALID_QUERY_MSG = FEATURE_INVALID_MSG_PREFIX + " causing a runtime exception: "; - public static final String UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG = - "Feature has an unknown exception caught while executing the feature query: "; - public static final String VALIDATION_FEATURE_FAILURE = "Validation failed for feature(s) of detector %s"; - public static final String NO_MODEL_ERR_MSG = "No RCF models are available either because RCF" - + " models are not ready or all nodes are unresponsive or the system might have bugs."; - -} diff --git a/src/main/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolator.java b/src/main/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolator.java deleted file mode 100644 index cc187d7f8..000000000 --- a/src/main/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolator.java +++ /dev/null @@ -1,40 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.dataprocessor; - -import java.util.Arrays; - -import com.google.common.math.DoubleMath; - -/** - * Interpolator sensitive to integral values. - */ -public class IntegerSensitiveSingleFeatureLinearUniformInterpolator extends SingleFeatureLinearUniformInterpolator { - - /** - * Interpolates integral/floating-point results. 
- * - * If all samples are integral, the results are integral. - * Else, the results are floating points. - * - * @param samples integral/floating-point samples - * @param numInterpolants the number of interpolants - * @return {code numInterpolants} interpolated results - */ - public double[] interpolate(double[] samples, int numInterpolants) { - double[] interpolants = super.interpolate(samples, numInterpolants); - if (Arrays.stream(samples).allMatch(DoubleMath::isMathematicalInteger)) { - interpolants = Arrays.stream(interpolants).map(Math::rint).toArray(); - } - return interpolants; - } -} diff --git a/src/main/java/org/opensearch/ad/dataprocessor/Interpolator.java b/src/main/java/org/opensearch/ad/dataprocessor/Interpolator.java deleted file mode 100644 index 752498e88..000000000 --- a/src/main/java/org/opensearch/ad/dataprocessor/Interpolator.java +++ /dev/null @@ -1,37 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.dataprocessor; - -/* - * An object for interpolating feature vectors. - * - * In certain situations, due to time and compute cost, we are only allowed to - * query a sparse sample of data points / feature vectors from a cluster. - * However, we need a large sample of feature vectors in order to train our - * anomaly detection algorithms. An Interpolator approximates the data points - * between a given, ordered list of samples. - */ -public interface Interpolator { - - /* - * Interpolates the given sample feature vectors. - * - * Computes a list `numInterpolants` feature vectors using the ordered list - * of `numSamples` input sample vectors where each sample vector has size - * `numFeatures`. - * - * @param samples A `numFeatures x numSamples` list of feature vectors. - * @param numInterpolants The desired number of interpolating vectors. - * @return A `numFeatures x numInterpolants` list of feature vectors. - */ - double[][] interpolate(double[][] samples, int numInterpolants); -} diff --git a/src/main/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolator.java b/src/main/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolator.java deleted file mode 100644 index b62aeda9b..000000000 --- a/src/main/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolator.java +++ /dev/null @@ -1,57 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.dataprocessor; - -/* - * A piecewise linear interpolator with uniformly spaced points. - * - * The LinearUniformInterpolator constructs a piecewise linear interpolation on - * the input list of sample feature vectors. That is, between every consecutive - * pair of points we construct a linear interpolation. The linear interpolation - * is computed on a per-feature basis. - * - * This class uses the helper class SingleFeatureLinearUniformInterpolator to - * compute per-feature interpolants. 
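(For concreteness, the uniform piecewise-linear contract these deleted classes implemented, and which the org.opensearch.timeseries.dataprocessor.Imputer used later in this patch presumably preserves: the n outputs are evenly spaced along the linear curve through the samples, computed per feature. A sketch under that assumption, matching the impute(double[][], int) usage visible in the FeatureManager hunks below:

    double[][] samples = { { 0.0, 10.0 } };        // one feature, two samples
    double[][] out = imputer.impute(samples, 5);   // expected: { { 0.0, 2.5, 5.0, 7.5, 10.0 } }

The deleted IntegerSensitiveSingleFeatureLinearUniformInterpolator additionally rounded each result with Math.rint whenever every input sample was a mathematical integer, so all-integer inputs stayed integral.)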
- * - * @see SingleFeatureLinearUniformInterpolator - */ -public class LinearUniformInterpolator implements Interpolator { - - private SingleFeatureLinearUniformInterpolator singleFeatureLinearUniformInterpolator; - - public LinearUniformInterpolator(SingleFeatureLinearUniformInterpolator singleFeatureLinearUniformInterpolator) { - this.singleFeatureLinearUniformInterpolator = singleFeatureLinearUniformInterpolator; - } - - /* - * Piecewise linearly interpolates the given sample feature vectors. - * - * Computes a list `numInterpolants` feature vectors using the ordered list - * of `numSamples` input sample vectors where each sample vector has size - * `numFeatures`. The feature vectors are computing using a piecewise linear - * interpolation. - * - * @param samples A `numFeatures x numSamples` list of feature vectors. - * @param numInterpolants The desired number of interpolating feature vectors. - * @return A `numFeatures x numInterpolants` list of feature vectors. - * @see SingleFeatureLinearUniformInterpolator - */ - public double[][] interpolate(double[][] samples, int numInterpolants) { - int numFeatures = samples.length; - double[][] interpolants = new double[numFeatures][numInterpolants]; - - for (int featureIndex = 0; featureIndex < numFeatures; featureIndex++) { - interpolants[featureIndex] = this.singleFeatureLinearUniformInterpolator.interpolate(samples[featureIndex], numInterpolants); - } - return interpolants; - } -} diff --git a/src/main/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolator.java b/src/main/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolator.java deleted file mode 100644 index 6349f29d3..000000000 --- a/src/main/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolator.java +++ /dev/null @@ -1,75 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.dataprocessor; - -import java.util.Arrays; - -/* - * A piecewise linear interpolator in a single feature dimension. - * - * A utility class for LinearUniformInterpolator. Constructs uniformly spaced - * piecewise linear interpolations within a single feature dimension. - * - * @see LinearUniformInterpolator - */ -public class SingleFeatureLinearUniformInterpolator { - - /* - * Piecewise linearly interpolates the given sample of one-dimensional - * features. - * - * Computes a list `numInterpolants` features using the ordered list of - * `numSamples` input one-dimensional samples. The interpolant features are - * computing using a piecewise linear interpolation. - * - * @param samples A `numSamples` sized list of sample features. - * @param numInterpolants The desired number of interpolating features. - * @return A `numInterpolants` sized array of interpolant features. - * @see LinearUniformInterpolator - */ - public double[] interpolate(double[] samples, int numInterpolants) { - int numSamples = samples.length; - double[] interpolants = new double[numInterpolants]; - - if (numSamples == 0) { - interpolants = new double[0]; - } else if (numSamples == 1) { - Arrays.fill(interpolants, samples[0]); - } else { - /* assume the piecewise linear interpolation between the samples is a - parameterized curve f(t) for t in [0, 1]. 
Each pair of samples - determines a interval [t_i, t_(i+1)]. For each interpolant we determine - which interval it lies inside and then scale the value of t, - accordingly to compute the interpolant value. - - for numerical stability reasons we omit processing the final - interpolant in this loop since this last interpolant is always equal - to the last sample. - */ - for (int interpolantIndex = 0; interpolantIndex < (numInterpolants - 1); interpolantIndex++) { - double tGlobal = ((double) interpolantIndex) / (numInterpolants - 1.0); - double tInterval = tGlobal * (numSamples - 1.0); - int intervalIndex = (int) Math.floor(tInterval); - tInterval -= intervalIndex; - - double leftSample = samples[intervalIndex]; - double rightSample = samples[intervalIndex + 1]; - double interpolant = (1.0 - tInterval) * leftSample + tInterval * rightSample; - interpolants[interpolantIndex] = interpolant; - } - - // the final interpolant is always the final sample - interpolants[numInterpolants - 1] = samples[numSamples - 1]; - } - return interpolants; - } -} diff --git a/src/main/java/org/opensearch/ad/feature/AbstractRetriever.java b/src/main/java/org/opensearch/ad/feature/AbstractRetriever.java index 3a9a31d2d..886dbcbc4 100644 --- a/src/main/java/org/opensearch/ad/feature/AbstractRetriever.java +++ b/src/main/java/org/opensearch/ad/feature/AbstractRetriever.java @@ -18,7 +18,6 @@ import java.util.Map; import java.util.Optional; -import org.opensearch.ad.common.exception.EndRunException; import org.opensearch.search.aggregations.Aggregation; import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.search.aggregations.Aggregations; @@ -30,6 +29,7 @@ import org.opensearch.search.aggregations.metrics.NumericMetricsAggregation.SingleValue; import org.opensearch.search.aggregations.metrics.Percentile; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.common.exception.EndRunException; public abstract class AbstractRetriever { protected double parseAggregation(Aggregation aggregationToParse) { diff --git a/src/main/java/org/opensearch/ad/feature/CompositeRetriever.java b/src/main/java/org/opensearch/ad/feature/CompositeRetriever.java index 73bc93c99..3c9a1632a 100644 --- a/src/main/java/org/opensearch/ad/feature/CompositeRetriever.java +++ b/src/main/java/org/opensearch/ad/feature/CompositeRetriever.java @@ -28,10 +28,6 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.action.support.IndicesOptions; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.util.ParseUtils; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.metadata.IndexNameExpressionResolver; import org.opensearch.cluster.service.ClusterService; @@ -49,6 +45,11 @@ import org.opensearch.search.aggregations.bucket.composite.CompositeAggregationBuilder; import org.opensearch.search.aggregations.bucket.composite.TermsValuesSourceBuilder; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.util.ParseUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; /** * @@ -157,7 +158,7 @@ public PageIterator iterator() throws IOException { CompositeAggregationBuilder composite = AggregationBuilders .composite( 
AGG_NAME_COMP, - anomalyDetector.getCategoryField().stream().map(f -> new TermsValuesSourceBuilder(f).field(f)).collect(Collectors.toList()) + anomalyDetector.getCategoryFields().stream().map(f -> new TermsValuesSourceBuilder(f).field(f)).collect(Collectors.toList()) ) .size(pageSize); for (Feature feature : anomalyDetector.getFeatureAttributes()) { @@ -218,8 +219,9 @@ public void onFailure(Exception e) { .asyncRequestWithInjectedSecurity( searchRequest, client::search, - anomalyDetector.getDetectorId(), + anomalyDetector.getId(), client, + AnalysisType.AD, searchResponseListener ); } diff --git a/src/main/java/org/opensearch/ad/feature/FeatureManager.java b/src/main/java/org/opensearch/ad/feature/FeatureManager.java index e882d643d..469f8707e 100644 --- a/src/main/java/org/opensearch/ad/feature/FeatureManager.java +++ b/src/main/java/org/opensearch/ad/feature/FeatureManager.java @@ -39,14 +39,16 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.support.ThreadedActionListener; -import org.opensearch.ad.CleanState; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.dataprocessor.Interpolator; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.CleanState; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.model.Entity; /** * A facade managing feature data operations and buffers. @@ -59,7 +61,7 @@ public class FeatureManager implements CleanState { private final Map>>> detectorIdsToTimeShingles; private final SearchFeatureDao searchFeatureDao; - private final Interpolator interpolator; + private final Imputer imputer; private final Clock clock; private final int maxTrainSamples; @@ -78,7 +80,7 @@ public class FeatureManager implements CleanState { * Constructor with dependencies and configuration. 
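 * (Aside: the imputer referenced below is the org.opensearch.timeseries.dataprocessor.Imputer
 * that replaces the Interpolator family deleted earlier in this patch; as the hunks that
 * follow show, each interpolator.interpolate(m, n) call becomes imputer.impute(m, n) with
 * the same double[][] in / double[][] out contract.)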
* * @param searchFeatureDao DAO of features from search - * @param interpolator interpolator of samples + * @param imputer imputer of samples * @param clock clock for system time * @param maxTrainSamples max number of samples from search * @param maxSampleStride max stride between uninterpolated train samples @@ -94,7 +96,7 @@ public class FeatureManager implements CleanState { */ public FeatureManager( SearchFeatureDao searchFeatureDao, - Interpolator interpolator, + Imputer imputer, Clock clock, int maxTrainSamples, int maxSampleStride, @@ -109,7 +111,7 @@ public FeatureManager( String adThreadPoolName ) { this.searchFeatureDao = searchFeatureDao; - this.interpolator = interpolator; + this.imputer = imputer; this.clock = clock; this.maxTrainSamples = maxTrainSamples; this.maxSampleStride = maxSampleStride; @@ -145,10 +147,10 @@ public void getCurrentFeatures(AnomalyDetector detector, long startTime, long en int shingleSize = detector.getShingleSize(); Deque>> shingle = detectorIdsToTimeShingles - .computeIfAbsent(detector.getDetectorId(), id -> new ArrayDeque<>(shingleSize)); + .computeIfAbsent(detector.getId(), id -> new ArrayDeque<>(shingleSize)); // To allow for small time variations/delays in running the detector. - long maxTimeDifference = detector.getDetectorIntervalInMilliseconds() / 2; + long maxTimeDifference = detector.getIntervalInMilliseconds() / 2; Map>> featuresMap = getNearbyPointsForShingle(detector, shingle, endTime, maxTimeDifference) .collect(Collectors.toMap(Entry::getKey, Entry::getValue)); @@ -156,7 +158,7 @@ public void getCurrentFeatures(AnomalyDetector detector, long startTime, long en if (missingRanges.size() > 0) { try { - searchFeatureDao.getFeatureSamplesForPeriods(detector, missingRanges, ActionListener.wrap(points -> { + searchFeatureDao.getFeatureSamplesForPeriods(detector, missingRanges, AnalysisType.AD, ActionListener.wrap(points -> { for (int i = 0; i < points.size(); i++) { Optional point = points.get(i); long rangeEndTime = missingRanges.get(i).getValue(); @@ -165,7 +167,7 @@ public void getCurrentFeatures(AnomalyDetector detector, long startTime, long en updateUnprocessedFeatures(detector, shingle, featuresMap, endTime, listener); }, listener::onFailure)); } catch (IOException e) { - listener.onFailure(new EndRunException(detector.getDetectorId(), CommonErrorMessages.INVALID_SEARCH_QUERY_MSG, e, true)); + listener.onFailure(new EndRunException(detector.getId(), CommonMessages.INVALID_SEARCH_QUERY_MSG, e, true)); } } else { listener.onResponse(getProcessedFeatures(shingle, detector, endTime)); @@ -177,7 +179,7 @@ private List> getMissingRangesInShingle( Map>> featuresMap, long endTime ) { - long intervalMilli = detector.getDetectorIntervalInMilliseconds(); + long intervalMilli = detector.getIntervalInMilliseconds(); int shingleSize = detector.getShingleSize(); return getFullShingleEndTimes(endTime, intervalMilli, shingleSize) .filter(time -> !featuresMap.containsKey(time)) @@ -212,7 +214,7 @@ private void updateUnprocessedFeatures( ActionListener listener ) { shingle.clear(); - getFullShingleEndTimes(endTime, detector.getDetectorIntervalInMilliseconds(), detector.getShingleSize()) + getFullShingleEndTimes(endTime, detector.getIntervalInMilliseconds(), detector.getShingleSize()) .mapToObj(time -> featuresMap.getOrDefault(time, new SimpleImmutableEntry<>(time, Optional.empty()))) .forEach(e -> shingle.add(e)); @@ -228,7 +230,7 @@ private double[][] filterAndFill(Deque>> shingle, double[][] result = null; if (filteredShingle.size() >= shingleSize - 
@@ -228,7 +230,7 @@ private double[][] filterAndFill(Deque>> shingle,
         double[][] result = null;
         if (filteredShingle.size() >= shingleSize - getMaxMissingPoints(shingleSize)) {
             // Imputes missing data points with the values of neighboring data points.
-            long maxMillisecondsDifference = maxNeighborDistance * detector.getDetectorIntervalInMilliseconds();
+            long maxMillisecondsDifference = maxNeighborDistance * detector.getIntervalInMilliseconds();
             result = getNearbyPointsForShingle(detector, filteredShingle, endTime, maxMillisecondsDifference)
                 .map(e -> e.getValue().getValue().orElse(null))
                 .filter(d -> d != null)
@@ -257,7 +259,7 @@ private Stream>>> getNearbyPointsForS
         long endTime,
         long maxMillisecondsDifference
     ) {
-        long intervalMilli = detector.getDetectorIntervalInMilliseconds();
+        long intervalMilli = detector.getIntervalInMilliseconds();
         int shingleSize = detector.getShingleSize();
         TreeMap<Long, Optional<double[]>> search = new TreeMap<>(
             shingle.stream().collect(Collectors.toMap(Entry::getKey, Entry::getValue))
         );
@@ -306,10 +308,11 @@ private void getColdStartSamples(Optional latest, AnomalyDetector detector
                 .getFeatureSamplesForPeriods(
                     detector,
                     sampleRanges,
+                    AnalysisType.AD,
                     new ThreadedActionListener<>(logger, threadPool, adThreadPoolName, getFeaturesListener, false)
                 );
         } catch (IOException e) {
-            listener.onFailure(new EndRunException(detector.getDetectorId(), CommonErrorMessages.INVALID_SEARCH_QUERY_MSG, e, true));
+            listener.onFailure(new EndRunException(detector.getId(), CommonMessages.INVALID_SEARCH_QUERY_MSG, e, true));
         }
     } else {
         listener.onResponse(Optional.empty());
     }
@@ -359,7 +362,7 @@ private Optional fillAndShingle(LinkedList> shingle
     }

     private List<Entry<Long, Long>> getColdStartSampleRanges(AnomalyDetector detector, long endMillis) {
-        long interval = detector.getDetectorIntervalInMilliseconds();
+        long interval = detector.getIntervalInMilliseconds();
         int numSamples = Math.max((int) (Duration.ofHours(this.trainSampleTimeRangeInHours).toMillis() / interval), this.minTrainSamples);
         return IntStream
             .rangeClosed(1, numSamples)
@@ -518,7 +521,7 @@ public void getPreviewFeatures(AnomalyDetector detector, long startMilli, long e
     private Entry<List<Entry<Long, Long>>, Integer> getSampleRanges(AnomalyDetector detector, long startMilli, long endMilli) {
         long start = truncateToMinute(startMilli);
         long end = truncateToMinute(endMilli);
-        long bucketSize = detector.getDetectorIntervalInMilliseconds();
+        long bucketSize = detector.getIntervalInMilliseconds();
         int numBuckets = (int) Math.floor((end - start) / (double) bucketSize);
         int numSamples = (int) Math.max(Math.min(numBuckets * previewSampleRate, maxPreviewSamples), 1);
         int stride = (int) Math.max(1, Math.floor((double) numBuckets / numSamples));
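The getSampleRanges hunk above caps the preview workload by sampling buckets at a stride. A worked example of that arithmetic, using made-up values for previewSampleRate and maxPreviewSamples (they are not the plugin's defaults):

public class PreviewSampleMath {
    public static void main(String[] args) {
        long bucketSize = 60_000L;               // 1-minute detector interval
        long start = 0L, end = 3_600_000L;       // one hour of preview range
        double previewSampleRate = 0.25;         // hypothetical value
        int maxPreviewSamples = 300;             // hypothetical value
        int numBuckets = (int) Math.floor((end - start) / (double) bucketSize);                          // 60
        int numSamples = (int) Math.max(Math.min(numBuckets * previewSampleRate, maxPreviewSamples), 1); // 15
        int stride = (int) Math.max(1, Math.floor((double) numBuckets / numSamples));                    // 4
        System.out.println(numSamples + " samples at stride " + stride);
    }
}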
@@ -545,7 +548,14 @@ void getPreviewSamplesInRangesForEntity(
         ActionListener<Entry<List<Entry<Long, Long>>, double[][]>> listener
     ) throws IOException {
         searchFeatureDao
-            .getColdStartSamplesForPeriods(detector, sampleRanges, entity, true, getSamplesRangesListener(sampleRanges, listener));
+            .getColdStartSamplesForPeriods(
+                detector,
+                sampleRanges,
+                Optional.ofNullable(entity),
+                true,
+                AnalysisType.AD,
+                getSamplesRangesListener(sampleRanges, listener)
+            );
     }

     private ActionListener<List<Optional<double[]>>> getSamplesRangesListener(
@@ -577,7 +587,8 @@ void getSamplesForRanges(
         List<Entry<Long, Long>> sampleRanges,
         ActionListener<Entry<List<Entry<Long, Long>>, double[][]>> listener
     ) throws IOException {
-        searchFeatureDao.getFeatureSamplesForPeriods(detector, sampleRanges, getSamplesRangesListener(sampleRanges, listener));
+        searchFeatureDao
+            .getFeatureSamplesForPeriods(detector, sampleRanges, AnalysisType.AD, getSamplesRangesListener(sampleRanges, listener));
     }

     /**
@@ -592,8 +603,8 @@ void getSamplesForRanges(
     private List<Entry<Long, Long>> getPreviewRanges(List<Entry<Long, Long>> ranges, int stride, int shingleSize) {
         double[] rangeStarts = ranges.stream().mapToDouble(Entry::getKey).toArray();
         double[] rangeEnds = ranges.stream().mapToDouble(Entry::getValue).toArray();
-        double[] previewRangeStarts = interpolator.interpolate(new double[][] { rangeStarts }, stride * (ranges.size() - 1) + 1)[0];
-        double[] previewRangeEnds = interpolator.interpolate(new double[][] { rangeEnds }, stride * (ranges.size() - 1) + 1)[0];
+        double[] previewRangeStarts = imputer.impute(new double[][] { rangeStarts }, stride * (ranges.size() - 1) + 1)[0];
+        double[] previewRangeEnds = imputer.impute(new double[][] { rangeEnds }, stride * (ranges.size() - 1) + 1)[0];
         List<Entry<Long, Long>> previewRanges = IntStream
             .range(shingleSize - 1, previewRangeStarts.length)
             .mapToObj(i -> new SimpleImmutableEntry<>((long) previewRangeStarts[i], (long) previewRangeEnds[i]))
@@ -614,7 +625,7 @@ private Entry getPreviewFeatures(double[][] samples, int
         Entry<double[][], double[][]> unprocessedAndProcessed = Optional
             .of(samples)
             .map(m -> transpose(m))
-            .map(m -> interpolator.interpolate(m, stride * (samples.length - 1) + 1))
+            .map(m -> imputer.impute(m, stride * (samples.length - 1) + 1))
             .map(m -> transpose(m))
             .map(m -> new SimpleImmutableEntry<>(copyOfRange(m, shingleSize - 1, m.length), batchShingle(m, shingleSize)))
             .get();
@@ -658,7 +669,7 @@ public void getFeatureDataPointsByBatch(
             listener.onResponse(points);
         }, listener::onFailure));
     } catch (Exception e) {
-        logger.error("Failed to get features for detector: " + detector.getDetectorId());
+        logger.error("Failed to get features for detector: " + detector.getId());
         listener.onFailure(e);
     }
 }
diff --git a/src/main/java/org/opensearch/ad/indices/ADIndex.java b/src/main/java/org/opensearch/ad/indices/ADIndex.java
index ea16c38a6..b345ef33e 100644
--- a/src/main/java/org/opensearch/ad/indices/ADIndex.java
+++ b/src/main/java/org/opensearch/ad/indices/ADIndex.java
@@ -13,43 +13,31 @@
 import java.util.function.Supplier;

-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
-import org.opensearch.ad.util.ThrowingSupplierWrapper;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.function.ThrowingSupplierWrapper;
+import org.opensearch.timeseries.indices.TimeSeriesIndex;

 /**
  * Represent an AD index
  *
  */
-public enum ADIndex {
+public enum ADIndex implements TimeSeriesIndex {

     // throw RuntimeException since we don't know how to handle the case when the mapping reading throws IOException
     RESULT(
-        CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+        ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
         true,
-        ThrowingSupplierWrapper.throwingSupplierWrapper(AnomalyDetectionIndices::getAnomalyResultMappings)
-    ),
-    CONFIG(
-        AnomalyDetector.ANOMALY_DETECTORS_INDEX,
-        false,
-        ThrowingSupplierWrapper.throwingSupplierWrapper(AnomalyDetectionIndices::getAnomalyDetectorMappings)
-    ),
-    JOB(
-        AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX,
-        false,
-        ThrowingSupplierWrapper.throwingSupplierWrapper(AnomalyDetectionIndices::getAnomalyDetectorJobMappings)
+        ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getResultMappings)
     ),
+    CONFIG(CommonName.CONFIG_INDEX, false, ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getConfigMappings)),
+    JOB(CommonName.JOB_INDEX, false, ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getJobMappings)),
     CHECKPOINT(
-        CommonName.CHECKPOINT_INDEX_NAME,
         false,
-        ThrowingSupplierWrapper.throwingSupplierWrapper(AnomalyDetectionIndices::getCheckpointMappings)
+        ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getCheckpointMappings)
     ),
-    STATE(
-        CommonName.DETECTION_STATE_INDEX,
-        false,
-        ThrowingSupplierWrapper.throwingSupplierWrapper(AnomalyDetectionIndices::getDetectionStateMappings)
-    );
+    STATE(ADCommonName.DETECTION_STATE_INDEX, false, ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getStateMappings));

     private final String indexName;
     // whether we use an alias for the index
@@ -62,16 +50,24 @@ public enum ADIndex {
         this.mapping = mappingSupplier.get();
     }

+    @Override
     public String getIndexName() {
         return indexName;
     }

+    @Override
     public boolean isAlias() {
         return alias;
     }

+    @Override
     public String getMapping() {
         return mapping;
     }

+    @Override
+    public boolean isJobIndex() {
+        return CommonName.JOB_INDEX.equals(indexName);
+    }
+
 }
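Every enum constant above routes its mapping read through ThrowingSupplierWrapper.throwingSupplierWrapper, which matches the enum's comment about rethrowing a mapping-file IOException as a RuntimeException. A plausible sketch of that helper follows; the real org.opensearch.timeseries.function.ThrowingSupplierWrapper may differ in details.

import java.util.function.Supplier;

// Editor's sketch of the wrapper pattern named above, not the plugin's exact source.
@FunctionalInterface
interface ThrowingSupplier<T, E extends Exception> {
    T get() throws E;
}

final class ThrowingSupplierWrapperSketch {
    private ThrowingSupplierWrapperSketch() {}

    // Converts a checked-exception supplier into a plain Supplier by rethrowing
    // any failure as an unchecked RuntimeException, so it can be used in enum
    // constant initializers like the ones above.
    static <T> Supplier<T> throwingSupplierWrapper(ThrowingSupplier<T, Exception> supplier) {
        return () -> {
            try {
                return supplier.get();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        };
    }
}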
diff --git a/src/main/java/org/opensearch/ad/indices/ADIndexManagement.java b/src/main/java/org/opensearch/ad/indices/ADIndexManagement.java
new file mode 100644
index 000000000..d0a40ecd8
--- /dev/null
+++ b/src/main/java/org/opensearch/ad/indices/ADIndexManagement.java
@@ -0,0 +1,275 @@
+/*
+ * SPDX-License-Identifier: Apache-2.0
+ *
+ * The OpenSearch Contributors require contributions made to
+ * this file be licensed under the Apache-2.0 license or a
+ * compatible open source license.
+ *
+ * Modifications Copyright OpenSearch Contributors. See
+ * GitHub history for details.
+ */
+
+package org.opensearch.ad.indices;
+
+import static org.opensearch.ad.constant.ADCommonName.DUMMY_AD_RESULT_ID;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_DETECTION_STATE_INDEX_MAPPING_FILE;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_RESULTS_INDEX_MAPPING_FILE;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_INDEX_MAPPING_FILE;
+
+import java.io.IOException;
+import java.util.EnumMap;
+
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
+import org.opensearch.action.admin.indices.create.CreateIndexRequest;
+import org.opensearch.action.admin.indices.create.CreateIndexResponse;
+import org.opensearch.action.delete.DeleteRequest;
+import org.opensearch.action.index.IndexRequest;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.model.AnomalyResult;
+import org.opensearch.client.Client;
+import org.opensearch.cluster.service.ClusterService;
+import org.opensearch.common.settings.Settings;
+import org.opensearch.common.xcontent.XContentType;
+import org.opensearch.core.action.ActionListener;
+import org.opensearch.core.xcontent.ToXContent;
+import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.indices.IndexManagement;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
+
+/**
+ * This class provides utility methods for various anomaly detection indices.
+ */
+public class ADIndexManagement extends IndexManagement<ADIndex> {
+    private static final Logger logger = LogManager.getLogger(ADIndexManagement.class);
+
+    // The index name pattern to query all the AD result history indices
+    public static final String AD_RESULT_HISTORY_INDEX_PATTERN = "<.opendistro-anomaly-results-history-{now/d}-1>";
+
+    // The index name pattern to query all AD result, history and current AD result
+    public static final String ALL_AD_RESULTS_INDEX_PATTERN = ".opendistro-anomaly-results*";
+
+    /**
+     * Constructor
+     *
+     * @param client OS client that supports administrative actions
+     * @param clusterService OS cluster service
+     * @param threadPool OS thread pool
+     * @param settings OS cluster setting
+     * @param nodeFilter Used to filter eligible nodes to host AD indices
+     * @param maxUpdateRunningTimes max number of retries to update index mapping and setting
+     * @throws IOException if mapping files cannot be read
+     */
+    public ADIndexManagement(
+        Client client,
+        ClusterService clusterService,
+        ThreadPool threadPool,
+        Settings settings,
+        DiscoveryNodeFilterer nodeFilter,
+        int maxUpdateRunningTimes
+    )
+        throws IOException {
+        super(
+            client,
+            clusterService,
+            threadPool,
+            settings,
+            nodeFilter,
+            maxUpdateRunningTimes,
+            ADIndex.class,
+            AD_MAX_PRIMARY_SHARDS.get(settings),
+            AD_RESULT_HISTORY_ROLLOVER_PERIOD.get(settings),
+            AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD.get(settings),
+            AD_RESULT_HISTORY_RETENTION_PERIOD.get(settings),
+            ADIndex.RESULT.getMapping()
+        );
+        this.clusterService.addLocalNodeClusterManagerListener(this);
+
+        this.indexStates = new EnumMap<ADIndex, IndexState>(ADIndex.class);
+
+        this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD, it -> historyMaxDocs = it);
+
+        this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_ROLLOVER_PERIOD, it -> {
+            historyRolloverPeriod = it;
+            rescheduleRollover();
+        });
+        this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_RETENTION_PERIOD, it -> {
+            historyRetentionPeriod = it;
+        });
+
+        this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MAX_PRIMARY_SHARDS, it -> maxPrimaryShards = it);
+    }
+
+    /**
+     * Get anomaly result index mapping json content.
+     *
+     * @return anomaly result index mapping
+     * @throws IOException IOException if mapping file can't be read correctly
+     */
+    public static String getResultMappings() throws IOException {
+        return getMappings(ANOMALY_RESULTS_INDEX_MAPPING_FILE);
+    }
+
+    /**
+     * Get anomaly detector state index mapping json content.
+     *
+     * @return anomaly detector state index mapping
+     * @throws IOException IOException if mapping file can't be read correctly
+     */
+    public static String getStateMappings() throws IOException {
+        String detectionStateMappings = getMappings(ANOMALY_DETECTION_STATE_INDEX_MAPPING_FILE);
+        String detectorIndexMappings = getConfigMappings();
+        detectorIndexMappings = detectorIndexMappings
+            .substring(detectorIndexMappings.indexOf("\"properties\""), detectorIndexMappings.lastIndexOf("}"));
+        return detectionStateMappings.replace("DETECTOR_INDEX_MAPPING_PLACE_HOLDER", detectorIndexMappings);
+    }
+
+    /**
+     * Get checkpoint index mapping json content.
+     *
+     * @return checkpoint index mapping
+     * @throws IOException IOException if mapping file can't be read correctly
+     */
+    public static String getCheckpointMappings() throws IOException {
+        return getMappings(CHECKPOINT_INDEX_MAPPING_FILE);
+    }
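To see what the substring-and-replace in getStateMappings() above produces, here is a toy, runnable illustration; the JSON strings are made up for the example and are not the plugin's real mapping files.

public class MappingSpliceDemo {
    public static void main(String[] args) {
        String detectorIndexMappings = "{\"properties\":{\"name\":{\"type\":\"keyword\"}}}";
        String detectionStateMappings = "{\"properties\":{\"detector\":{DETECTOR_INDEX_MAPPING_PLACE_HOLDER}}}";
        // Keep everything from "properties" up to (not including) the final closing brace,
        // i.e. drop the outer object wrapper of the detector mapping.
        String spliced = detectorIndexMappings
            .substring(detectorIndexMappings.indexOf("\"properties\""), detectorIndexMappings.lastIndexOf("}"));
        // Prints: {"properties":{"detector":{"properties":{"name":{"type":"keyword"}}}}}
        System.out.println(detectionStateMappings.replace("DETECTOR_INDEX_MAPPING_PLACE_HOLDER", spliced));
    }
}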
+
+    /**
+     * Whether the anomaly result index exists.
+     *
+     * @return true if the anomaly result index exists
+     */
+    @Override
+    public boolean doesDefaultResultIndexExist() {
+        return doesAliasExist(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
+    }
+
+    /**
+     * Whether the anomaly state index exists.
+     *
+     * @return true if the anomaly state index exists
+     */
+    @Override
+    public boolean doesStateIndexExist() {
+        return doesIndexExist(ADCommonName.DETECTION_STATE_INDEX);
+    }
+
+    /**
+     * Whether the checkpoint index exists.
+     *
+     * @return true if the checkpoint index exists
+     */
+    @Override
+    public boolean doesCheckpointIndexExist() {
+        return doesIndexExist(ADCommonName.CHECKPOINT_INDEX_NAME);
+    }
+
+    /**
+     * Create the anomaly result index without checking whether it already exists.
+     *
+     * @param actionListener action called after create index
+     */
+    @Override
+    public void initDefaultResultIndexDirectly(ActionListener<CreateIndexResponse> actionListener) {
+        initResultIndexDirectly(
+            AD_RESULT_HISTORY_INDEX_PATTERN,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            true,
+            AD_RESULT_HISTORY_INDEX_PATTERN,
+            ADIndex.RESULT,
+            actionListener
+        );
+    }
+
+    /**
+     * Create the state index.
+     *
+     * @param actionListener action called after create index
+     */
+    @Override
+    public void initStateIndex(ActionListener<CreateIndexResponse> actionListener) {
+        try {
+            CreateIndexRequest request = new CreateIndexRequest(ADCommonName.DETECTION_STATE_INDEX)
+                .mapping(getStateMappings(), XContentType.JSON)
+                .settings(settings);
+            adminClient.indices().create(request, markMappingUpToDate(ADIndex.STATE, actionListener));
+        } catch (IOException e) {
+            logger.error("Fail to init AD detection state index", e);
+            actionListener.onFailure(e);
+        }
+    }
+
+    /**
+     * Create the checkpoint index.
+     *
+     * @param actionListener action called after create index
+     * @throws EndRunException EndRunException due to failure to get mapping
+     */
+    @Override
+    public void initCheckpointIndex(ActionListener<CreateIndexResponse> actionListener) {
+        String mapping;
+        try {
+            mapping = getCheckpointMappings();
+        } catch (IOException e) {
+            throw new EndRunException("", "Cannot find checkpoint mapping file", true);
+        }
+        CreateIndexRequest request = new CreateIndexRequest(ADCommonName.CHECKPOINT_INDEX_NAME).mapping(mapping, XContentType.JSON);
+        choosePrimaryShards(request, true);
+        adminClient.indices().create(request, markMappingUpToDate(ADIndex.CHECKPOINT, actionListener));
+    }
+
+    @Override
+    protected void rolloverAndDeleteHistoryIndex() {
+        rolloverAndDeleteHistoryIndex(
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ALL_AD_RESULTS_INDEX_PATTERN,
+            AD_RESULT_HISTORY_INDEX_PATTERN,
+            ADIndex.RESULT
+        );
+    }
+
+    /**
+     * Create config index directly.
+     *
+     * @param actionListener action called after create index
+     * @throws IOException IOException from {@link IndexManagement#getConfigMappings}
+     */
+    @Override
+    public void initConfigIndex(ActionListener<CreateIndexResponse> actionListener) throws IOException {
+        super.initConfigIndex(markMappingUpToDate(ADIndex.CONFIG, actionListener));
+    }
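A hedged usage sketch of the exist-check/init pair defined above: callers can guard writes by creating the state index first. The adIndexManagement and logger instances here are hypothetical, and the error handling is illustrative only.

// Ensure the AD state index exists before writing to it (editor's sketch).
if (!adIndexManagement.doesStateIndexExist()) {
    adIndexManagement.initStateIndex(ActionListener.wrap(
        response -> logger.info("state index created, acknowledged={}", response.isAcknowledged()),
        exception -> logger.error("failed to create state index", exception)
    ));
}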
+
+    /**
+     * Create job index.
+     *
+     * @param actionListener action called after create index
+     */
+    @Override
+    public void initJobIndex(ActionListener<CreateIndexResponse> actionListener) {
+        super.initJobIndex(markMappingUpToDate(ADIndex.JOB, actionListener));
+    }
+
+    @Override
+    protected IndexRequest createDummyIndexRequest(String resultIndex) throws IOException {
+        AnomalyResult dummyResult = AnomalyResult.getDummyResult();
+        return new IndexRequest(resultIndex)
+            .id(DUMMY_AD_RESULT_ID)
+            .source(dummyResult.toXContent(XContentBuilder.builder(XContentType.JSON.xContent()), ToXContent.EMPTY_PARAMS));
+    }
+
+    @Override
+    protected DeleteRequest createDummyDeleteRequest(String resultIndex) throws IOException {
+        return new DeleteRequest(resultIndex).id(DUMMY_AD_RESULT_ID);
+    }
+
+    @Override
+    public void initCustomResultIndexDirectly(String resultIndex, ActionListener<CreateIndexResponse> actionListener) {
+        initResultIndexDirectly(resultIndex, null, false, AD_RESULT_HISTORY_INDEX_PATTERN, ADIndex.RESULT, actionListener);
+    }
+}
diff --git a/src/main/java/org/opensearch/ad/ml/CheckpointDao.java b/src/main/java/org/opensearch/ad/ml/CheckpointDao.java
index 78c406032..adb097cb6 100644
--- a/src/main/java/org/opensearch/ad/ml/CheckpointDao.java
+++ b/src/main/java/org/opensearch/ad/ml/CheckpointDao.java
@@ -48,18 +48,12 @@
 import org.opensearch.action.get.MultiGetAction;
 import org.opensearch.action.get.MultiGetRequest;
 import org.opensearch.action.get.MultiGetResponse;
-import org.opensearch.action.index.IndexRequest;
-import org.opensearch.action.index.IndexResponse;
 import org.opensearch.action.support.IndicesOptions;
 import org.opensearch.action.update.UpdateRequest;
 import org.opensearch.action.update.UpdateResponse;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.indices.ADIndex;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.util.ClientUtil;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.client.Client;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.index.IndexNotFoundException;
@@ -68,6 +62,12 @@
 import org.opensearch.index.reindex.DeleteByQueryAction;
 import org.opensearch.index.reindex.DeleteByQueryRequest;
 import org.opensearch.index.reindex.ScrollableHitSource;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.util.ClientUtil;

 import com.amazon.randomcutforest.RandomCutForest;
 import com.amazon.randomcutforest.config.Precision;
@@ -98,16 +98,10 @@ public class CheckpointDao {
     static final String INDEX_DELETED_LOG_MSG = "Checkpoint index has been deleted. Has nothing to do:";
     static final String NOT_ABLE_TO_DELETE_LOG_MSG = "Cannot delete all checkpoints of detector";

-    // ======================================
-    // Model serialization/deserialization
-    // ======================================
-    public static final String ENTITY_SAMPLE = "sp";
     public static final String ENTITY_RCF = "rcf";
     public static final String ENTITY_THRESHOLD = "th";
     public static final String ENTITY_TRCF = "trcf";
-    public static final String FIELD_MODEL = "model";
     public static final String FIELD_MODELV2 = "modelV2";
-    public static final String TIMESTAMP = "timestamp";
     public static final String DETECTOR_ID = "detectorId";

     // dependencies
@@ -133,7 +127,7 @@ public class CheckpointDao {

     private final Class<? extends ThresholdingModel> thresholdingModelClass;

-    private final AnomalyDetectionIndices indexUtil;
+    private final ADIndexManagement indexUtil;
     private final JsonParser parser = new JsonParser();
     // we won't read/write a checkpoint larger than a threshold
     private final int maxCheckpointBytes;
@@ -171,7 +165,7 @@ public CheckpointDao(
         ThresholdedRandomCutForestMapper trcfMapper,
         Schema<ThresholdedRandomCutForestState> trcfSchema,
         Class<? extends ThresholdingModel> thresholdingModelClass,
-        AnomalyDetectionIndices indexUtil,
+        ADIndexManagement indexUtil,
         int maxCheckpointBytes,
         GenericObjectPool<LinkedBuffer> serializeRCFBufferPool,
         int serializeRCFBufferSize,
@@ -193,15 +187,11 @@ public CheckpointDao(
         this.anomalyRate = anomalyRate;
     }

-    private void saveModelCheckpointSync(Map<String, Object> source, String modelId) {
-        clientUtil.timedRequest(new IndexRequest(indexName).id(modelId).source(source), logger, client::index);
-    }
-
     private void putModelCheckpoint(String modelId, Map<String, Object> source, ActionListener<Void> listener) {
         if (indexUtil.doesCheckpointIndexExist()) {
             saveModelCheckpointAsync(source, modelId, listener);
         } else {
-            onCheckpointNotExist(source, modelId, true, listener);
+            onCheckpointNotExist(source, modelId, listener);
         }
     }

@@ -217,7 +207,7 @@ public void putTRCFCheckpoint(String modelId, ThresholdedRandomCutForest forest,
         String modelCheckpoint = toCheckpoint(forest);
         if (modelCheckpoint != null) {
             source.put(FIELD_MODELV2, modelCheckpoint);
-            source.put(TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
+            source.put(CommonName.TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
             putModelCheckpoint(modelId, source, listener);
         } else {
             listener.onFailure(new RuntimeException("Fail to create checkpoint to save"));
@@ -234,30 +224,22 @@ public void putTRCFCheckpoint(String modelId, ThresholdedRandomCutForest forest,
     public void putThresholdCheckpoint(String modelId, ThresholdingModel threshold, ActionListener<Void> listener) {
         String modelCheckpoint = AccessController.doPrivileged((PrivilegedAction<String>) () -> gson.toJson(threshold));
         Map<String, Object> source = new HashMap<>();
-        source.put(FIELD_MODEL, modelCheckpoint);
-        source.put(TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
+        source.put(CommonName.FIELD_MODEL, modelCheckpoint);
+        source.put(CommonName.TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
         putModelCheckpoint(modelId, source, listener);
     }

-    private void onCheckpointNotExist(Map<String, Object> source, String modelId, boolean isAsync, ActionListener<Void> listener) {
+    private void onCheckpointNotExist(Map<String, Object> source, String modelId, ActionListener<Void> listener) {
         indexUtil.initCheckpointIndex(ActionListener.wrap(initResponse -> {
             if (initResponse.isAcknowledged()) {
-                if (isAsync) {
-                    saveModelCheckpointAsync(source, modelId, listener);
-                } else {
-                    saveModelCheckpointSync(source, modelId);
-                }
+                saveModelCheckpointAsync(source, modelId, listener);
             } else {
                 throw new RuntimeException("Creating checkpoint with mappings call not acknowledged.");
             }
         }, exception -> {
             if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) {
                 // It is possible the index has been created while we were sending the create request
-                if (isAsync) {
-                    saveModelCheckpointAsync(source, modelId, listener);
-                } else {
-                    saveModelCheckpointSync(source, modelId);
-                }
+                saveModelCheckpointAsync(source, modelId, listener);
             } else {
                 logger.error(String.format(Locale.ROOT, "Unexpected error creating index %s", indexName), exception);
             }
@@ -310,11 +292,11 @@ public Map toIndexSource(ModelState modelState) thr
             );
             return source;
         }
-        String detectorId = modelState.getDetectorId();
+        String detectorId = modelState.getId();
         source.put(DETECTOR_ID, detectorId);
         // we cannot pass Optional as OpenSearch does not know how to serialize an Optional value
         source.put(FIELD_MODELV2, serializedModel.get());
-        source.put(TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
+        source.put(CommonName.TIMESTAMP, ZonedDateTime.now(ZoneOffset.UTC));
         source.put(CommonName.SCHEMA_VERSION_FIELD, indexUtil.getSchemaVersion(ADIndex.CHECKPOINT));
         Optional<Entity> entity = model.getEntity();
         if (entity.isPresent()) {
@@ -339,7 +321,7 @@ public Optional toCheckpoint(EntityModel model, String modelId) {
         try {
             JsonObject json = new JsonObject();
             if (model.getSamples() != null && !(model.getSamples().isEmpty())) {
-                json.add(ENTITY_SAMPLE, gson.toJsonTree(model.getSamples()));
+                json.add(CommonName.ENTITY_SAMPLE, gson.toJsonTree(model.getSamples()));
             }
             if (model.getTrcf().isPresent()) {
                 json.addProperty(ENTITY_TRCF, toCheckpoint(model.getTrcf().get()));
@@ -439,7 +421,7 @@ public void deleteModelCheckpointByDetectorId(String detectorID) {
         // with exponential back off. If the maximum retry limit is reached, processing
         // halts and all failed requests are returned in the response. Any delete
         // requests that completed successfully still stick, they are not rolled back.
-        DeleteByQueryRequest deleteRequest = new DeleteByQueryRequest(CommonName.CHECKPOINT_INDEX_NAME)
+        DeleteByQueryRequest deleteRequest = new DeleteByQueryRequest(ADCommonName.CHECKPOINT_INDEX_NAME)
             .setQuery(new MatchQueryBuilder(DETECTOR_ID, detectorID))
             .setIndicesOptions(IndicesOptions.LENIENT_EXPAND_OPEN)
             .setAbortOnVersionConflict(false) // when current delete happens, previous might not finish.
@@ -495,7 +477,7 @@ public Optional> fromEntityModelCheckpoint(Map
         ArrayDeque<double[]> samples = null;
-        if (json.has(ENTITY_SAMPLE)) {
+        if (json.has(CommonName.ENTITY_SAMPLE)) {
             // verified, don't need privileged call to get permission
             samples = new ArrayDeque<>(
-                Arrays.asList(this.gson.fromJson(json.getAsJsonArray(ENTITY_SAMPLE), new double[0][0].getClass()))
+                Arrays.asList(this.gson.fromJson(json.getAsJsonArray(CommonName.ENTITY_SAMPLE), new double[0][0].getClass()))
             );
         } else {
             // avoid possible null pointer exception
@@ -545,7 +527,7 @@ public Optional> fromEntityModelCheckpoint(Map
             Optional<RandomCutForest> forest = deserializeRCFModel((String) modelV1, rcfModelId);
             if (!forest.isPresent()) {
                 logger.error("Unexpected error when deserializing [{}]", rcfModelId);
@@ -718,7 +700,7 @@ private Optional processThresholdModelCheckpoint(GetResponse response) {
             .ofNullable(response)
             .filter(GetResponse::isExists)
             .map(GetResponse::getSource)
-            .map(source -> source.get(FIELD_MODEL));
+            .map(source -> source.get(CommonName.FIELD_MODEL));
     }

     private Optional<Map<String, Object>> processRawCheckpoint(GetResponse response) {
@@ -738,7 +720,7 @@ public void batchWrite(BulkRequest request, ActionListener listene
             clientUtil.execute(BulkAction.INSTANCE, request, listener);
         } else {
             // create index failure. Notify callers using listener.
-            listener.onFailure(new AnomalyDetectionException("Creating checkpoint with mappings call not acknowledged."));
+            listener.onFailure(new TimeSeriesException("Creating checkpoint with mappings call not acknowledged."));
         }
     }, exception -> {
         if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) {
diff --git a/src/main/java/org/opensearch/ad/ml/EntityColdStarter.java b/src/main/java/org/opensearch/ad/ml/EntityColdStarter.java
index 67b750833..1044b84ce 100644
--- a/src/main/java/org/opensearch/ad/ml/EntityColdStarter.java
+++ b/src/main/java/org/opensearch/ad/ml/EntityColdStarter.java
@@ -11,7 +11,7 @@
 package org.opensearch.ad.ml;

-import static org.opensearch.ad.settings.AnomalyDetectorSettings.COOLDOWN_MINUTES;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_COOLDOWN_MINUTES;

 import java.time.Clock;
 import java.time.Duration;
@@ -36,27 +36,29 @@
 import org.apache.logging.log4j.core.util.Throwables;
 import org.apache.logging.log4j.message.ParameterizedMessage;
 import org.opensearch.action.support.ThreadedActionListener;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.CleanState;
-import org.opensearch.ad.MaintenanceState;
-import org.opensearch.ad.NodeStateManager;
 import org.opensearch.ad.caching.DoorKeeper;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.dataprocessor.Interpolator;
 import org.opensearch.ad.feature.FeatureManager;
-import org.opensearch.ad.feature.SearchFeatureDao;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
 import org.opensearch.ad.ratelimit.CheckpointWriteWorker;
 import org.opensearch.ad.ratelimit.RequestPriority;
-import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.settings.EnabledSetting;
-import org.opensearch.ad.util.ExceptionUtil;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.CleanState;
+import org.opensearch.timeseries.MaintenanceState;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.dataprocessor.Imputer;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.model.Config;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.ExceptionUtil;

 import com.amazon.randomcutforest.config.Precision;
 import com.amazon.randomcutforest.config.TransformMethod;
@@ -78,7 +80,7 @@ public class EntityColdStarter implements MaintenanceState, CleanState {
     private final double thresholdMinPvalue;
     private final int defaulStrideLength;
     private final int defaultNumberOfSamples;
-    private final Interpolator interpolator;
+    private final Imputer imputer;
     private final SearchFeatureDao searchFeatureDao;
     private Instant lastThrottledColdStartTime;
     private final FeatureManager featureManager;
@@ -109,7 +111,7 @@ public class EntityColdStarter implements MaintenanceState, CleanState {
      * results are returned.
      * @param defaultSampleStride default sample distances measured in detector intervals.
      * @param defaultTrainSamples Default train samples to collect.
-     * @param interpolator Used to generate data points between samples.
+     * @param imputer Used to generate data points between samples.
      * @param searchFeatureDao Used to issue ES queries.
      * @param thresholdMinPvalue min P-value for thresholding
     * @param featureManager Used to create features for models.
@@ -131,7 +133,7 @@ public EntityColdStarter(
         int numMinSamples,
         int defaultSampleStride,
         int defaultTrainSamples,
-        Interpolator interpolator,
+        Imputer imputer,
         SearchFeatureDao searchFeatureDao,
         double thresholdMinPvalue,
         FeatureManager featureManager,
@@ -151,11 +153,11 @@ public EntityColdStarter(
         this.numMinSamples = numMinSamples;
         this.defaulStrideLength = defaultSampleStride;
         this.defaultNumberOfSamples = defaultTrainSamples;
-        this.interpolator = interpolator;
+        this.imputer = imputer;
         this.searchFeatureDao = searchFeatureDao;
         this.thresholdMinPvalue = thresholdMinPvalue;
         this.featureManager = featureManager;
-        this.coolDownMinutes = (int) (COOLDOWN_MINUTES.get(settings).getMinutes());
+        this.coolDownMinutes = (int) (AD_COOLDOWN_MINUTES.get(settings).getMinutes());
         this.doorKeepers = new ConcurrentHashMap<>();
         this.modelTtl = modelTtl;
         this.checkpointWriteQueue = checkpointWriteQueue;
@@ -174,7 +176,7 @@ public EntityColdStarter(
         int numMinSamples,
         int maxSampleStride,
         int maxTrainSamples,
-        Interpolator interpolator,
+        Imputer imputer,
         SearchFeatureDao searchFeatureDao,
         double thresholdMinPvalue,
         FeatureManager featureManager,
@@ -193,7 +195,7 @@ public EntityColdStarter(
             numMinSamples,
             maxSampleStride,
             maxTrainSamples,
-            interpolator,
+            imputer,
             searchFeatureDao,
             thresholdMinPvalue,
             featureManager,
@@ -249,9 +251,9 @@ private void coldStart(
         DoorKeeper doorKeeper = doorKeepers.computeIfAbsent(detectorId, id -> {
             // reset every 60 intervals
             return new DoorKeeper(
-                AnomalyDetectorSettings.DOOR_KEEPER_FOR_COLD_STARTER_MAX_INSERTION,
-                AnomalyDetectorSettings.DOOR_KEEPER_FAULSE_POSITIVE_RATE,
-                detector.getDetectionIntervalDuration().multipliedBy(AnomalyDetectorSettings.DOOR_KEEPER_MAINTENANCE_FREQ),
+                TimeSeriesSettings.DOOR_KEEPER_FOR_COLD_STARTER_MAX_INSERTION,
+                TimeSeriesSettings.DOOR_KEEPER_FALSE_POSITIVE_RATE,
+                detector.getIntervalDuration().multipliedBy(TimeSeriesSettings.DOOR_KEEPER_MAINTENANCE_FREQ),
                 clock
             );
         });
@@ -294,11 +296,11 @@ private void coldStart(
             if (ExceptionUtil.isOverloaded(cause)) {
                 logger.error("too many requests");
                 lastThrottledColdStartTime = Instant.now();
-            } else if (cause instanceof AnomalyDetectionException || exception instanceof AnomalyDetectionException) {
+            } else if (cause instanceof TimeSeriesException || exception instanceof TimeSeriesException) {
                 // e.g., cannot find anomaly detector
                 nodeStateManager.setException(detectorId, exception);
             } else {
-                nodeStateManager.setException(detectorId, new AnomalyDetectionException(detectorId, cause));
+                nodeStateManager.setException(detectorId, new TimeSeriesException(detectorId, cause));
             }
             listener.onFailure(exception);
         } catch (Exception e) {
@@ -307,7 +309,7 @@ private void coldStart(
         });

         threadPool
-            .executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)
+            .executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)
             .execute(
                 () -> getEntityColdStartData(
                     detectorId,
@@ -315,7 +317,7 @@ private void coldStart(
                     new ThreadedActionListener<>(
                         logger,
                         threadPool,
-                        AnomalyDetectorPlugin.AD_THREAD_POOL_NAME,
+                        TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME,
                         coldStartCallBack,
                         false
                     )
@@ -362,7 +364,7 @@ private void trainModelFromDataSegments(
             .parallelExecutionEnabled(false)
             .compact(true)
             .precision(Precision.FLOAT_32)
-            .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO)
+            .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO)
             // same with dimension for opportunistic memory saving
             // Usually, we use it as shingleSize(dimension). When a new point comes in, we will
             // look at the point store if there is any overlapping. Say the previously-stored
@@ -408,13 +410,13 @@ private void trainModelFromDataSegments(
      * @param listener listener to return training data
      */
     private void getEntityColdStartData(String detectorId, Entity entity, ActionListener<Optional<List<double[][]>>> listener) {
-        ActionListener<Optional<AnomalyDetector>> getDetectorListener = ActionListener.wrap(detectorOp -> {
+        ActionListener<Optional<? extends Config>> getDetectorListener = ActionListener.wrap(detectorOp -> {
             if (!detectorOp.isPresent()) {
                 listener.onFailure(new EndRunException(detectorId, "AnomalyDetector is not available.", false));
                 return;
             }
             List<double[][]> coldStartData = new ArrayList<>();
-            AnomalyDetector detector = detectorOp.get();
+            AnomalyDetector detector = (AnomalyDetector) detectorOp.get();

             ActionListener<Optional<Long>> minTimeListener = ActionListener.wrap(earliest -> {
                 if (earliest.isPresent()) {
@@ -439,18 +441,20 @@ private void getEntityColdStartData(String detectorId, Entity entity, ActionList
             }, listener::onFailure);

             searchFeatureDao
-                .getEntityMinDataTime(
+                .getMinDataTime(
                     detector,
-                    entity,
-                    new ThreadedActionListener<>(logger, threadPool, AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, minTimeListener, false)
+                    Optional.ofNullable(entity),
+                    AnalysisType.AD,
+                    new ThreadedActionListener<>(logger, threadPool, TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, minTimeListener, false)
                 );
         }, listener::onFailure);

         nodeStateManager
-            .getAnomalyDetector(
+            .getConfig(
                 detectorId,
+                AnalysisType.AD,
-                new ThreadedActionListener<>(logger, threadPool, AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, getDetectorListener, false)
+                new ThreadedActionListener<>(logger, threadPool, TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, getDetectorListener, false)
             );
     }
@@ -465,7 +469,7 @@ private void getFeatures(
         long startTimeMs,
         long endTimeMs
     ) {
-        if (startTimeMs >= endTimeMs || endTimeMs - startTimeMs < detector.getDetectorIntervalInMilliseconds()) {
+        if (startTimeMs >= endTimeMs || endTimeMs - startTimeMs < detector.getIntervalInMilliseconds()) {
             listener.onResponse(Optional.of(lastRoundColdStartData));
             return;
         }
@@ -499,8 +503,8 @@ private void getFeatures(
                     int numInterpolants = (i - lastSample.getLeft()) * stride + 1;
                     double[][] points = featureManager
                         .transpose(
-                            interpolator
-                                .interpolate(
+                            imputer
+                                .impute(
                                     featureManager.transpose(new double[][] { lastSample.getRight(), featuresOptional.get() }),
                                     numInterpolants
                                 )
@@ -550,14 +554,21 @@ private void getFeatures(
                     .getColdStartSamplesForPeriods(
                         detector,
                         sampleRanges,
-                        entity,
+                        Optional.ofNullable(entity),
                         // Accept empty bucket.
                         // 0, as returned by the engine should constitute a valid answer, “null” is a missing answer — it may be that 0
                         // is meaningless in some case, but 0 is also meaningful in some cases. It may be that the query defining the
                         // metric is ill-formed, but that cannot be solved by cold-start strategy of the AD plugin — if we attempt to do
                         // that, we will have issues with legitimate interpretations of 0.
                         true,
-                        new ThreadedActionListener<>(logger, threadPool, AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, getFeaturelistener, false)
+                        AnalysisType.AD,
+                        new ThreadedActionListener<>(
+                            logger,
+                            threadPool,
+                            TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME,
+                            getFeaturelistener,
+                            false
+                        )
                     );
             } catch (Exception e) {
                 listener.onFailure(e);
             }
@@ -594,8 +605,8 @@ private int calculateColdStartDataSize(List coldStartData) {
      */
     private Pair<Integer, Integer> selectRangeParam(AnomalyDetector detector) {
         int shingleSize = detector.getShingleSize();
-        if (EnabledSetting.isInterpolationInColdStartEnabled()) {
-            long delta = detector.getDetectorIntervalInMinutes();
+        if (ADEnabledSetting.isInterpolationInColdStartEnabled()) {
+            long delta = detector.getIntervalInMinutes();
             int strideLength = defaulStrideLength;
             int numberOfSamples = defaultNumberOfSamples;
@@ -630,7 +641,7 @@ private List> getTrainSampleRanges(
         int stride,
         int numberOfSamples
     ) {
-        long bucketSize = ((IntervalTimeConfiguration) detector.getDetectionInterval()).toDuration().toMillis();
+        long bucketSize = ((IntervalTimeConfiguration) detector.getInterval()).toDuration().toMillis();
         int numBuckets = (int) Math.floor((endMilli - startMilli) / (double) bucketSize);
         // adjust if numStrides is more than the max samples
         int numStrides = Math.min((int) Math.floor(numBuckets / (double) stride), numberOfSamples);
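A worked example of the cold-start range arithmetic in getTrainSampleRanges above, with made-up inputs (a 5-minute interval, one day of history, and hypothetical stride/sample bounds):

public class ColdStartRangeMath {
    public static void main(String[] args) {
        long bucketSize = 5 * 60_000L;                    // hypothetical 5-minute interval
        long startMilli = 0L, endMilli = 24 * 3_600_000L; // one day of history
        int stride = 8, numberOfSamples = 24;             // hypothetical bounds
        int numBuckets = (int) Math.floor((endMilli - startMilli) / (double) bucketSize);          // 288
        int numStrides = Math.min((int) Math.floor(numBuckets / (double) stride), numberOfSamples); // min(36, 24) = 24
        System.out.println(numBuckets + " buckets, " + numStrides + " strides");
    }
}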
@@ -652,14 +663,14 @@ private List> getTrainSampleRanges(
      * cold start queue to pull another request (if any) to execute.
      */
     public void trainModel(Entity entity, String detectorId, ModelState<EntityModel> modelState, ActionListener<Void> listener) {
-        nodeStateManager.getAnomalyDetector(detectorId, ActionListener.wrap(detectorOptional -> {
+        nodeStateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(detectorOptional -> {
             if (false == detectorOptional.isPresent()) {
                 logger.warn(new ParameterizedMessage("AnomalyDetector [{}] is not available.", detectorId));
-                listener.onFailure(new AnomalyDetectionException(detectorId, "fail to find detector"));
+                listener.onFailure(new TimeSeriesException(detectorId, "fail to find detector"));
                 return;
             }

-            AnomalyDetector detector = detectorOptional.get();
+            AnomalyDetector detector = (AnomalyDetector) detectorOptional.get();

             Queue<double[]> samples = modelState.getModel().getSamples();
             String modelId = modelState.getModelId();
diff --git a/src/main/java/org/opensearch/ad/ml/EntityModel.java b/src/main/java/org/opensearch/ad/ml/EntityModel.java
index 5e6aeddd3..348ad8c6e 100644
--- a/src/main/java/org/opensearch/ad/ml/EntityModel.java
+++ b/src/main/java/org/opensearch/ad/ml/EntityModel.java
@@ -15,7 +15,7 @@
 import java.util.Optional;
 import java.util.Queue;

-import org.opensearch.ad.model.Entity;
+import org.opensearch.timeseries.model.Entity;

 import com.amazon.randomcutforest.parkservices.ThresholdedRandomCutForest;

diff --git a/src/main/java/org/opensearch/ad/ml/ModelManager.java b/src/main/java/org/opensearch/ad/ml/ModelManager.java
index 24349eb66..14f935aae 100644
--- a/src/main/java/org/opensearch/ad/ml/ModelManager.java
+++ b/src/main/java/org/opensearch/ad/ml/ModelManager.java
@@ -33,19 +33,20 @@
 import org.apache.logging.log4j.Logger;
 import org.apache.logging.log4j.message.ParameterizedMessage;
 import org.opensearch.ad.DetectorModelSize;
-import org.opensearch.ad.MemoryTracker;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.feature.FeatureManager;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.util.DateUtils;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.Setting;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.core.action.ActionListener;
+import org.opensearch.timeseries.MemoryTracker;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;

 import com.amazon.randomcutforest.RandomCutForest;
 import com.amazon.randomcutforest.config.Precision;
@@ -300,7 +301,7 @@ private void processRestoredTRcf(
             forests.put(modelId, model.get());
             getTRcfResult(model.get(), point, listener);
         } else {
-            throw new ResourceNotFoundException(detectorId, CommonErrorMessages.NO_CHECKPOINT_ERR_MSG + modelId);
+            throw new ResourceNotFoundException(detectorId, ADCommonMessages.NO_CHECKPOINT_ERR_MSG + modelId);
         }
     }
@@ -324,7 +325,7 @@ private void processRestoredCheckpoint(
         if (model.get().getModel() != null && model.get().getModel().getForest() != null)
             listener.onResponse(model.get().getModel().getForest().getTotalUpdates());
         } else {
-            listener.onFailure(new ResourceNotFoundException(detectorId, CommonErrorMessages.NO_CHECKPOINT_ERR_MSG + modelId));
+            listener.onFailure(new ResourceNotFoundException(detectorId, ADCommonMessages.NO_CHECKPOINT_ERR_MSG + modelId));
         }
     }
@@ -380,7 +381,7 @@ private void processThresholdCheckpoint(
             thresholds.put(modelId, model.get());
             getThresholdingResult(model.get(), score, listener);
         } else {
-            throw new ResourceNotFoundException(detectorId, CommonErrorMessages.NO_CHECKPOINT_ERR_MSG + modelId);
+            throw new ResourceNotFoundException(detectorId, ADCommonMessages.NO_CHECKPOINT_ERR_MSG + modelId);
         }
     }
@@ -528,7 +529,7 @@ private void trainModelForStep(
             .parallelExecutionEnabled(false)
             .compact(true)
             .precision(Precision.FLOAT_32)
-            .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO)
+            .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO)
             .shingleSize(detector.getShingleSize())
             .anomalyRate(1 - thresholdMinPvalue)
             .transformMethod(TransformMethod.NORMALIZE)
@@ -538,7 +539,7 @@ private void trainModelForStep(
             .build();
         Arrays.stream(dataPoints).forEach(s -> trcf.process(s, 0));

-        String modelId = SingleStreamModelIdMapper.getRcfModelId(detector.getDetectorId(), step);
+        String modelId = SingleStreamModelIdMapper.getRcfModelId(detector.getId(), step);
         checkpointDao.putTRCFCheckpoint(modelId, trcf, ActionListener.wrap(r -> listener.onResponse(null), listener::onFailure));
     }
@@ -622,7 +623,7 @@ public List getPreviewResults(double[][] dataPoints, int shi
             .parallelExecutionEnabled(false)
             .compact(true)
             .precision(Precision.FLOAT_32)
-            .boundingBoxCacheFraction(AnomalyDetectorSettings.BATCH_BOUNDING_BOX_CACHE_RATIO)
+            .boundingBoxCacheFraction(TimeSeriesSettings.BATCH_BOUNDING_BOX_CACHE_RATIO)
             .shingleSize(shingleSize)
             .anomalyRate(1 - this.thresholdMinPvalue)
             .transformMethod(TransformMethod.NORMALIZE)
diff --git a/src/main/java/org/opensearch/ad/ml/ModelState.java b/src/main/java/org/opensearch/ad/ml/ModelState.java
index 430e06bd1..bb9050ecb 100644
--- a/src/main/java/org/opensearch/ad/ml/ModelState.java
+++ b/src/main/java/org/opensearch/ad/ml/ModelState.java
@@ -17,8 +17,9 @@
 import java.util.HashMap;
 import java.util.Map;

-import org.opensearch.ad.ExpiringState;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.timeseries.ExpiringState;
+import org.opensearch.timeseries.constant.CommonName;

 /**
  * A ML model and states such as usage.
@@ -110,7 +111,7 @@ public String getModelId() {
      *
      * @return detectorId associated with the model
      */
-    public String getDetectorId() {
+    public String getId() {
         return detectorId;
     }
@@ -179,8 +180,8 @@ public void setPriority(float priority) {
     public Map<String, Object> getModelStateAsMap() {
         return new HashMap<String, Object>() {
             {
-                put(CommonName.MODEL_ID_KEY, modelId);
-                put(CommonName.DETECTOR_ID_KEY, detectorId);
+                put(CommonName.MODEL_ID_FIELD, modelId);
+                put(ADCommonName.DETECTOR_ID_KEY, detectorId);
                 put(MODEL_TYPE_KEY, modelType);
                 /* A stats API broadcasts requests to all nodes and renders node responses using toXContent.
                  *
diff --git a/src/main/java/org/opensearch/ad/ml/TRCFMemoryAwareConcurrentHashmap.java b/src/main/java/org/opensearch/ad/ml/TRCFMemoryAwareConcurrentHashmap.java
index 7b7b1fe7d..2380173b0 100644
--- a/src/main/java/org/opensearch/ad/ml/TRCFMemoryAwareConcurrentHashmap.java
+++ b/src/main/java/org/opensearch/ad/ml/TRCFMemoryAwareConcurrentHashmap.java
@@ -13,8 +13,8 @@
 import java.util.concurrent.ConcurrentHashMap;

-import org.opensearch.ad.MemoryTracker;
-import org.opensearch.ad.MemoryTracker.Origin;
+import org.opensearch.timeseries.MemoryTracker;
+import org.opensearch.timeseries.MemoryTracker.Origin;

 import com.amazon.randomcutforest.parkservices.ThresholdedRandomCutForest;
@@ -37,7 +37,7 @@ public ModelState remove(Object key) {
         ModelState<ThresholdedRandomCutForest> deletedModelState = super.remove(key);
         if (deletedModelState != null && deletedModelState.getModel() != null) {
             long memoryToRelease = memoryTracker.estimateTRCFModelSize(deletedModelState.getModel());
-            memoryTracker.releaseMemory(memoryToRelease, true, Origin.SINGLE_ENTITY_DETECTOR);
+            memoryTracker.releaseMemory(memoryToRelease, true, Origin.REAL_TIME_DETECTOR);
         }
         return deletedModelState;
     }
@@ -47,7 +47,7 @@ public ModelState put(K key, ModelState
         ModelState<ThresholdedRandomCutForest> previousAssociatedState = super.put(key, value);
         if (value != null && value.getModel() != null) {
             long memoryToConsume = memoryTracker.estimateTRCFModelSize(value.getModel());
-            memoryTracker.consumeMemory(memoryToConsume, true, Origin.SINGLE_ENTITY_DETECTOR);
+            memoryTracker.consumeMemory(memoryToConsume, true, Origin.REAL_TIME_DETECTOR);
         }
         return previousAssociatedState;
     }
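The TRCFMemoryAwareConcurrentHashmap hunks above illustrate a general pattern: a map that reports entry sizes to an accounting callback whenever entries are added or removed. A generic, self-contained sketch of that pattern follows; the class and the functional interfaces are the editor's stand-ins for MemoryTracker and estimateTRCFModelSize, not the plugin's code.

import java.util.concurrent.ConcurrentHashMap;
import java.util.function.LongConsumer;
import java.util.function.ToLongFunction;

// Editor's sketch of the memory-accounting map pattern used above.
class MemoryAccountingMap<K, V> extends ConcurrentHashMap<K, V> {
    private final ToLongFunction<V> sizer;   // estimates an entry's footprint in bytes
    private final LongConsumer onConsume;    // called with bytes added on put
    private final LongConsumer onRelease;    // called with bytes freed on remove

    MemoryAccountingMap(ToLongFunction<V> sizer, LongConsumer onConsume, LongConsumer onRelease) {
        this.sizer = sizer;
        this.onConsume = onConsume;
        this.onRelease = onRelease;
    }

    @Override
    public V put(K key, V value) {
        V previous = super.put(key, value);
        if (value != null) {
            onConsume.accept(sizer.applyAsLong(value));
        }
        return previous;
    }

    @Override
    public V remove(Object key) {
        V removed = super.remove(key);
        if (removed != null) {
            onRelease.accept(sizer.applyAsLong(removed));
        }
        return removed;
    }
}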
diff --git a/src/main/java/org/opensearch/ad/ml/ThresholdingResult.java b/src/main/java/org/opensearch/ad/ml/ThresholdingResult.java
index 4d8f72876..a2da03f51 100644
--- a/src/main/java/org/opensearch/ad/ml/ThresholdingResult.java
+++ b/src/main/java/org/opensearch/ad/ml/ThresholdingResult.java
@@ -13,25 +13,24 @@
 import java.time.Instant;
 import java.util.Arrays;
+import java.util.Collections;
 import java.util.List;
 import java.util.Objects;
+import java.util.Optional;

 import org.apache.commons.lang.builder.ToStringBuilder;
-import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.AnomalyResult;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.FeatureData;
+import org.opensearch.timeseries.ml.IntermediateResult;
+import org.opensearch.timeseries.model.Config;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.FeatureData;

 /**
  * Data object containing thresholding results.
  */
-public class ThresholdingResult {
+public class ThresholdingResult extends IntermediateResult {

     private final double grade;
-    private final double confidence;
-    private final double rcfScore;
-    private long totalUpdates;
-
     /**
      * position of the anomaly vis a vis the current time (can be -ve) if anomaly is
      * detected late, which can and should happen sometimes; for shingle size 1; this
@@ -135,6 +134,8 @@ public class ThresholdingResult {
     // size of the forest
     private int forestSize;

+    protected final double confidence;
+
     /**
      * Constructor for default empty value or backward compatibility.
      * In terms of bwc, when an old node sends request for threshold results,
@@ -163,10 +164,10 @@ public ThresholdingResult(
         double threshold,
         int forestSize
     ) {
-        this.grade = grade;
+        super(totalUpdates, rcfScore);
         this.confidence = confidence;
-        this.rcfScore = rcfScore;
-        this.totalUpdates = totalUpdates;
+        this.grade = grade;
+
         this.relativeIndex = relativeIndex;
         this.relevantAttribution = relevantAttribution;
         this.pastValues = pastValues;
@@ -177,29 +178,21 @@ public ThresholdingResult(
     }

     /**
-     * Returns the anomaly grade.
+     * Returns the confidence for the result (e.g., anomaly grade in AD).
      *
-     * @return the anomaly grade
+     * @return confidence for the result
      */
-    public double getGrade() {
-        return grade;
+    public double getConfidence() {
+        return confidence;
     }

     /**
-     * Returns the confidence for the grade.
+     * Returns the anomaly grade.
      *
-     * @return confidence for the grade
+     * @return the anomaly grade
      */
-    public double getConfidence() {
-        return confidence;
-    }
-
-    public double getRcfScore() {
-        return rcfScore;
-    }
-
-    public long getTotalUpdates() {
-        return totalUpdates;
+    public double getGrade() {
+        return grade;
     }

     public int getRelativeIndex() {
@@ -232,21 +225,19 @@ public int getForestSize() {

     @Override
     public boolean equals(Object o) {
-        if (this == o)
-            return true;
-        if (o == null || getClass() != o.getClass())
+        if (!super.equals(o))
+            return false;
+        if (getClass() != o.getClass())
             return false;
         ThresholdingResult that = (ThresholdingResult) o;
-        return this.grade == that.grade
-            && this.confidence == that.confidence
-            && this.rcfScore == that.rcfScore
-            && this.totalUpdates == that.totalUpdates
+        return Double.doubleToLongBits(confidence) == Double.doubleToLongBits(that.confidence)
+            && Double.doubleToLongBits(this.grade) == Double.doubleToLongBits(that.grade)
             && this.relativeIndex == that.relativeIndex
             && Arrays.equals(relevantAttribution, that.relevantAttribution)
             && Arrays.equals(pastValues, that.pastValues)
             && Arrays.deepEquals(expectedValuesList, that.expectedValuesList)
             && Arrays.equals(likelihoodOfValues, that.likelihoodOfValues)
-            && threshold == that.threshold
+            && Double.doubleToLongBits(threshold) == Double.doubleToLongBits(that.threshold)
             && forestSize == that.forestSize;
     }

@@ -254,10 +245,9 @@ public boolean equals(Object o) {
     public int hashCode() {
         return Objects
             .hash(
-                grade,
+                super.hashCode(),
                 confidence,
-                rcfScore,
-                totalUpdates,
+                grade,
                 relativeIndex,
                 Arrays.hashCode(relevantAttribution),
                 Arrays.hashCode(pastValues),
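The equals() hunk above replaces raw == comparisons of doubles with Double.doubleToLongBits. This is standard practice for equals/hashCode on double fields: == never matches NaN against itself and conflates +0.0 with -0.0, so it can disagree with Objects.hash. A short demonstration of the difference:

public class DoubleBitsDemo {
    public static void main(String[] args) {
        System.out.println(Double.NaN == Double.NaN);                                           // false: == never matches NaN
        System.out.println(Double.doubleToLongBits(Double.NaN) == Double.doubleToLongBits(Double.NaN)); // true
        System.out.println(0.0 == -0.0);                                                        // true: == conflates signed zeros
        System.out.println(Double.doubleToLongBits(0.0) == Double.doubleToLongBits(-0.0));      // false
    }
}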
@@ -271,10 +261,9 @@ public int hashCode() {
     @Override
     public String toString() {
         return new ToStringBuilder(this)
+            .append(super.toString())
             .append("grade", grade)
             .append("confidence", confidence)
-            .append("rcfScore", rcfScore)
-            .append("totalUpdates", totalUpdates)
             .append("relativeIndex", relativeIndex)
             .append("relevantAttribution", Arrays.toString(relevantAttribution))
             .append("pastValues", Arrays.toString(pastValues))
@@ -302,43 +291,47 @@ public String toString() {
      * @param error Error
      * @return converted AnomalyResult
      */
-    public AnomalyResult toAnomalyResult(
-        AnomalyDetector detector,
+    @Override
+    public List toIndexableResults(
+        Config detector,
         Instant dataStartInstant,
         Instant dataEndInstant,
         Instant executionStartInstant,
         Instant executionEndInstant,
         List<FeatureData> featureData,
-        Entity entity,
+        Optional<Entity> entity,
         Integer schemaVersion,
         String modelId,
         String taskId,
         String error
     ) {
-        return AnomalyResult
-            .fromRawTRCFResult(
-                detector.getDetectorId(),
-                detector.getDetectorIntervalInMilliseconds(),
-                taskId,
-                rcfScore,
-                grade,
-                confidence,
-                featureData,
-                dataStartInstant,
-                dataEndInstant,
-                executionStartInstant,
-                executionEndInstant,
-                error,
-                entity,
-                detector.getUser(),
-                schemaVersion,
-                modelId,
-                relevantAttribution,
-                relativeIndex,
-                pastValues,
-                expectedValuesList,
-                likelihoodOfValues,
-                threshold
+        return Collections
+            .singletonList(
+                AnomalyResult
+                    .fromRawTRCFResult(
+                        detector.getId(),
+                        detector.getIntervalInMilliseconds(),
+                        taskId,
+                        rcfScore,
+                        grade,
+                        confidence,
+                        featureData,
+                        dataStartInstant,
+                        dataEndInstant,
+                        executionStartInstant,
+                        executionEndInstant,
+                        error,
+                        entity,
+                        detector.getUser(),
+                        schemaVersion,
+                        modelId,
+                        relevantAttribution,
+                        relativeIndex,
+                        pastValues,
+                        expectedValuesList,
+                        likelihoodOfValues,
+                        threshold
+                    )
             );
     }
 }
diff --git a/src/main/java/org/opensearch/ad/model/ADEntityTaskProfile.java b/src/main/java/org/opensearch/ad/model/ADEntityTaskProfile.java
index 82f300524..3d473d0e2 100644
--- a/src/main/java/org/opensearch/ad/model/ADEntityTaskProfile.java
+++ b/src/main/java/org/opensearch/ad/model/ADEntityTaskProfile.java
@@ -22,6 +22,7 @@
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.core.xcontent.XContentParser;
+import org.opensearch.timeseries.model.Entity;

 /**
  * HC detector's entity task profile.
diff --git a/src/main/java/org/opensearch/ad/model/ADTask.java b/src/main/java/org/opensearch/ad/model/ADTask.java
index 9d9bbe420..93566a0f0 100644
--- a/src/main/java/org/opensearch/ad/model/ADTask.java
+++ b/src/main/java/org/opensearch/ad/model/ADTask.java
@@ -11,85 +11,42 @@
 package org.opensearch.ad.model;

-import static org.opensearch.ad.model.ADTaskState.NOT_ENDED_STATES;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;

 import java.io.IOException;
 import java.time.Instant;

-import org.opensearch.ad.annotation.Generated;
-import org.opensearch.ad.util.ParseUtils;
 import org.opensearch.commons.authuser.User;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
-import org.opensearch.core.common.io.stream.Writeable;
-import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.core.xcontent.XContentParser;
+import org.opensearch.timeseries.annotation.Generated;
+import org.opensearch.timeseries.model.DateRange;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.TimeSeriesTask;
+import org.opensearch.timeseries.util.ParseUtils;

 import com.google.common.base.Objects;

 /**
  * One anomaly detection task means one detector starts to run until stopped.
  */
-public class ADTask implements ToXContentObject, Writeable {
+public class ADTask extends TimeSeriesTask {

-    public static final String TASK_ID_FIELD = "task_id";
-    public static final String LAST_UPDATE_TIME_FIELD = "last_update_time";
-    public static final String STARTED_BY_FIELD = "started_by";
-    public static final String STOPPED_BY_FIELD = "stopped_by";
-    public static final String ERROR_FIELD = "error";
-    public static final String STATE_FIELD = "state";
     public static final String DETECTOR_ID_FIELD = "detector_id";
-    public static final String TASK_PROGRESS_FIELD = "task_progress";
-    public static final String INIT_PROGRESS_FIELD = "init_progress";
-    public static final String CURRENT_PIECE_FIELD = "current_piece";
-    public static final String EXECUTION_START_TIME_FIELD = "execution_start_time";
-    public static final String EXECUTION_END_TIME_FIELD = "execution_end_time";
-    public static final String IS_LATEST_FIELD = "is_latest";
-    public static final String TASK_TYPE_FIELD = "task_type";
-    public static final String CHECKPOINT_ID_FIELD = "checkpoint_id";
-    public static final String COORDINATING_NODE_FIELD = "coordinating_node";
-    public static final String WORKER_NODE_FIELD = "worker_node";
     public static final String DETECTOR_FIELD = "detector";
     public static final String DETECTION_DATE_RANGE_FIELD = "detection_date_range";
-    public static final String ENTITY_FIELD = "entity";
-    public static final String PARENT_TASK_ID_FIELD = "parent_task_id";
-    public static final String ESTIMATED_MINUTES_LEFT_FIELD = "estimated_minutes_left";
-    public static final String USER_FIELD = "user";
-    public static final String HISTORICAL_TASK_PREFIX = "HISTORICAL";

-    private String taskId = null;
-    private Instant lastUpdateTime = null;
-    private String startedBy = null;
-    private String stoppedBy = null;
-    private String error = null;
-    private String state = null;
-    private String detectorId = null;
-    private Float taskProgress = null;
-    private Float initProgress = null;
-    private Instant currentPiece = null;
-    private Instant executionStartTime = null;
-    private Instant executionEndTime = null;
-    private Boolean isLatest = null;
-    private String taskType = null;
-    private String checkpointId = null;
     private AnomalyDetector detector = null;
-
-    private String coordinatingNode = null;
-    private String workerNode = null;
-    private DetectionDateRange detectionDateRange = null;
-    private Entity entity = null;
-    private String parentTaskId = null;
-    private Integer estimatedMinutesLeft = null;
-    private User user = null;
+    private DateRange detectionDateRange = null;

     private ADTask() {}

     public ADTask(StreamInput input) throws IOException {
         this.taskId = input.readOptionalString();
         this.taskType = input.readOptionalString();
-        this.detectorId = input.readOptionalString();
+        this.configId = input.readOptionalString();
         if (input.readBoolean()) {
             this.detector = new AnomalyDetector(input);
         } else {
@@ -117,7 +74,7 @@ public ADTask(StreamInput input) throws IOException {
         // Below are new fields added since AD 1.1
         if (input.available() > 0) {
             if (input.readBoolean()) {
-                this.detectionDateRange = new DetectionDateRange(input);
+                this.detectionDateRange = new DateRange(input);
             } else {
                 this.detectionDateRange = null;
             }
@@ -135,7 +92,7 @@ public ADTask(StreamInput input) throws IOException {
     public void writeTo(StreamOutput out) throws IOException {
         out.writeOptionalString(taskId);
         out.writeOptionalString(taskType);
-        out.writeOptionalString(detectorId);
+        out.writeOptionalString(configId);
null) { out.writeBoolean(true); detector.writeTo(out); @@ -183,175 +140,34 @@ public static Builder builder() { return new Builder(); } - public boolean isHistoricalTask() { - return taskType.startsWith(HISTORICAL_TASK_PREFIX); - } - + @Override public boolean isEntityTask() { return ADTaskType.HISTORICAL_HC_ENTITY.name().equals(taskType); } - /** - * Get detector level task id. If a task has no parent task, the task is detector level task. - * @return detector level task id - */ - public String getDetectorLevelTaskId() { - return getParentTaskId() != null ? getParentTaskId() : getTaskId(); - } - - public boolean isDone() { - return !NOT_ENDED_STATES.contains(this.getState()); - } - - public static class Builder { - private String taskId = null; - private String taskType = null; - private String detectorId = null; + public static class Builder extends TimeSeriesTask.Builder { private AnomalyDetector detector = null; - private String state = null; - private Float taskProgress = null; - private Float initProgress = null; - private Instant currentPiece = null; - private Instant executionStartTime = null; - private Instant executionEndTime = null; - private Boolean isLatest = null; - private String error = null; - private String checkpointId = null; - private Instant lastUpdateTime = null; - private String startedBy = null; - private String stoppedBy = null; - private String coordinatingNode = null; - private String workerNode = null; - private DetectionDateRange detectionDateRange = null; - private Entity entity = null; - private String parentTaskId; - private Integer estimatedMinutesLeft; - private User user = null; + private DateRange detectionDateRange = null; public Builder() {} - public Builder taskId(String taskId) { - this.taskId = taskId; - return this; - } - - public Builder lastUpdateTime(Instant lastUpdateTime) { - this.lastUpdateTime = lastUpdateTime; - return this; - } - - public Builder startedBy(String startedBy) { - this.startedBy = startedBy; - return this; - } - - public Builder stoppedBy(String stoppedBy) { - this.stoppedBy = stoppedBy; - return this; - } - - public Builder error(String error) { - this.error = error; - return this; - } - - public Builder state(String state) { - this.state = state; - return this; - } - - public Builder detectorId(String detectorId) { - this.detectorId = detectorId; - return this; - } - - public Builder taskProgress(Float taskProgress) { - this.taskProgress = taskProgress; - return this; - } - - public Builder initProgress(Float initProgress) { - this.initProgress = initProgress; - return this; - } - - public Builder currentPiece(Instant currentPiece) { - this.currentPiece = currentPiece; - return this; - } - - public Builder executionStartTime(Instant executionStartTime) { - this.executionStartTime = executionStartTime; - return this; - } - - public Builder executionEndTime(Instant executionEndTime) { - this.executionEndTime = executionEndTime; - return this; - } - - public Builder isLatest(Boolean isLatest) { - this.isLatest = isLatest; - return this; - } - - public Builder taskType(String taskType) { - this.taskType = taskType; - return this; - } - - public Builder checkpointId(String checkpointId) { - this.checkpointId = checkpointId; - return this; - } - public Builder detector(AnomalyDetector detector) { this.detector = detector; return this; } - public Builder coordinatingNode(String coordinatingNode) { - this.coordinatingNode = coordinatingNode; - return this; - } - - public Builder workerNode(String workerNode) { - this.workerNode = 
workerNode; - return this; - } - - public Builder detectionDateRange(DetectionDateRange detectionDateRange) { + public Builder detectionDateRange(DateRange detectionDateRange) { this.detectionDateRange = detectionDateRange; return this; } - public Builder entity(Entity entity) { - this.entity = entity; - return this; - } - - public Builder parentTaskId(String parentTaskId) { - this.parentTaskId = parentTaskId; - return this; - } - - public Builder estimatedMinutesLeft(Integer estimatedMinutesLeft) { - this.estimatedMinutesLeft = estimatedMinutesLeft; - return this; - } - - public Builder user(User user) { - this.user = user; - return this; - } - public ADTask build() { ADTask adTask = new ADTask(); adTask.taskId = this.taskId; adTask.lastUpdateTime = this.lastUpdateTime; adTask.error = this.error; adTask.state = this.state; - adTask.detectorId = this.detectorId; + adTask.configId = this.configId; adTask.taskProgress = this.taskProgress; adTask.initProgress = this.initProgress; adTask.currentPiece = this.currentPiece; @@ -379,56 +195,9 @@ public ADTask build() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { XContentBuilder xContentBuilder = builder.startObject(); - if (taskId != null) { - xContentBuilder.field(TASK_ID_FIELD, taskId); - } - if (lastUpdateTime != null) { - xContentBuilder.field(LAST_UPDATE_TIME_FIELD, lastUpdateTime.toEpochMilli()); - } - if (startedBy != null) { - xContentBuilder.field(STARTED_BY_FIELD, startedBy); - } - if (stoppedBy != null) { - xContentBuilder.field(STOPPED_BY_FIELD, stoppedBy); - } - if (error != null) { - xContentBuilder.field(ERROR_FIELD, error); - } - if (state != null) { - xContentBuilder.field(STATE_FIELD, state); - } - if (detectorId != null) { - xContentBuilder.field(DETECTOR_ID_FIELD, detectorId); - } - if (taskProgress != null) { - xContentBuilder.field(TASK_PROGRESS_FIELD, taskProgress); - } - if (initProgress != null) { - xContentBuilder.field(INIT_PROGRESS_FIELD, initProgress); - } - if (currentPiece != null) { - xContentBuilder.field(CURRENT_PIECE_FIELD, currentPiece.toEpochMilli()); - } - if (executionStartTime != null) { - xContentBuilder.field(EXECUTION_START_TIME_FIELD, executionStartTime.toEpochMilli()); - } - if (executionEndTime != null) { - xContentBuilder.field(EXECUTION_END_TIME_FIELD, executionEndTime.toEpochMilli()); - } - if (isLatest != null) { - xContentBuilder.field(IS_LATEST_FIELD, isLatest); - } - if (taskType != null) { - xContentBuilder.field(TASK_TYPE_FIELD, taskType); - } - if (checkpointId != null) { - xContentBuilder.field(CHECKPOINT_ID_FIELD, checkpointId); - } - if (coordinatingNode != null) { - xContentBuilder.field(COORDINATING_NODE_FIELD, coordinatingNode); - } - if (workerNode != null) { - xContentBuilder.field(WORKER_NODE_FIELD, workerNode); + xContentBuilder = super.toXContent(xContentBuilder, params); + if (configId != null) { + xContentBuilder.field(DETECTOR_ID_FIELD, configId); } if (detector != null) { xContentBuilder.field(DETECTOR_FIELD, detector); @@ -436,18 +205,6 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws if (detectionDateRange != null) { xContentBuilder.field(DETECTION_DATE_RANGE_FIELD, detectionDateRange); } - if (entity != null) { - xContentBuilder.field(ENTITY_FIELD, entity); - } - if (parentTaskId != null) { - xContentBuilder.field(PARENT_TASK_ID_FIELD, parentTaskId); - } - if (estimatedMinutesLeft != null) { - xContentBuilder.field(ESTIMATED_MINUTES_LEFT_FIELD, estimatedMinutesLeft); - } - if 
(user != null) { - xContentBuilder.field(USER_FIELD, user); - } return xContentBuilder.endObject(); } @@ -474,7 +231,7 @@ public static ADTask parse(XContentParser parser, String taskId) throws IOExcept String parsedTaskId = taskId; String coordinatingNode = null; String workerNode = null; - DetectionDateRange detectionDateRange = null; + DateRange detectionDateRange = null; Entity entity = null; String parentTaskId = null; Integer estimatedMinutesLeft = null; @@ -486,73 +243,73 @@ public static ADTask parse(XContentParser parser, String taskId) throws IOExcept parser.nextToken(); switch (fieldName) { - case LAST_UPDATE_TIME_FIELD: + case TimeSeriesTask.LAST_UPDATE_TIME_FIELD: lastUpdateTime = ParseUtils.toInstant(parser); break; - case STARTED_BY_FIELD: + case TimeSeriesTask.STARTED_BY_FIELD: startedBy = parser.text(); break; - case STOPPED_BY_FIELD: + case TimeSeriesTask.STOPPED_BY_FIELD: stoppedBy = parser.text(); break; - case ERROR_FIELD: + case TimeSeriesTask.ERROR_FIELD: error = parser.text(); break; - case STATE_FIELD: + case TimeSeriesTask.STATE_FIELD: state = parser.text(); break; case DETECTOR_ID_FIELD: detectorId = parser.text(); break; - case TASK_PROGRESS_FIELD: + case TimeSeriesTask.TASK_PROGRESS_FIELD: taskProgress = parser.floatValue(); break; - case INIT_PROGRESS_FIELD: + case TimeSeriesTask.INIT_PROGRESS_FIELD: initProgress = parser.floatValue(); break; - case CURRENT_PIECE_FIELD: + case TimeSeriesTask.CURRENT_PIECE_FIELD: currentPiece = ParseUtils.toInstant(parser); break; - case EXECUTION_START_TIME_FIELD: + case TimeSeriesTask.EXECUTION_START_TIME_FIELD: executionStartTime = ParseUtils.toInstant(parser); break; - case EXECUTION_END_TIME_FIELD: + case TimeSeriesTask.EXECUTION_END_TIME_FIELD: executionEndTime = ParseUtils.toInstant(parser); break; - case IS_LATEST_FIELD: + case TimeSeriesTask.IS_LATEST_FIELD: isLatest = parser.booleanValue(); break; - case TASK_TYPE_FIELD: + case TimeSeriesTask.TASK_TYPE_FIELD: taskType = parser.text(); break; - case CHECKPOINT_ID_FIELD: + case TimeSeriesTask.CHECKPOINT_ID_FIELD: checkpointId = parser.text(); break; case DETECTOR_FIELD: detector = AnomalyDetector.parse(parser); break; - case TASK_ID_FIELD: + case TimeSeriesTask.TASK_ID_FIELD: parsedTaskId = parser.text(); break; - case COORDINATING_NODE_FIELD: + case TimeSeriesTask.COORDINATING_NODE_FIELD: coordinatingNode = parser.text(); break; - case WORKER_NODE_FIELD: + case TimeSeriesTask.WORKER_NODE_FIELD: workerNode = parser.text(); break; case DETECTION_DATE_RANGE_FIELD: - detectionDateRange = DetectionDateRange.parse(parser); + detectionDateRange = DateRange.parse(parser); break; - case ENTITY_FIELD: + case TimeSeriesTask.ENTITY_FIELD: entity = Entity.parse(parser); break; - case PARENT_TASK_ID_FIELD: + case TimeSeriesTask.PARENT_TASK_ID_FIELD: parentTaskId = parser.text(); break; - case ESTIMATED_MINUTES_LEFT_FIELD: + case TimeSeriesTask.ESTIMATED_MINUTES_LEFT_FIELD: estimatedMinutesLeft = parser.intValue(); break; - case USER_FIELD: + case TimeSeriesTask.USER_FIELD: user = User.parse(parser); break; default: @@ -571,15 +328,16 @@ public static ADTask parse(XContentParser parser, String taskId) throws IOExcept detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), detector.getSchemaVersion(), detector.getLastUpdateTime(), - detector.getCategoryField(), + detector.getCategoryFields(), detector.getUser(), - 
detector.getResultIndex() + detector.getCustomResultIndex(), + detector.getImputationOption() ); return new Builder() .taskId(parsedTaskId) @@ -588,7 +346,7 @@ public static ADTask parse(XContentParser parser, String taskId) throws IOExcept .stoppedBy(stoppedBy) .error(error) .state(state) - .detectorId(detectorId) + .configId(detectorId) .taskProgress(taskProgress) .initProgress(initProgress) .currentPiece(currentPiece) @@ -610,185 +368,35 @@ public static ADTask parse(XContentParser parser, String taskId) throws IOExcept @Generated @Override - public boolean equals(Object o) { - if (this == o) + public boolean equals(Object other) { + if (this == other) return true; - if (o == null || getClass() != o.getClass()) + if (other == null || getClass() != other.getClass()) return false; - ADTask that = (ADTask) o; - return Objects.equal(getTaskId(), that.getTaskId()) - && Objects.equal(getLastUpdateTime(), that.getLastUpdateTime()) - && Objects.equal(getStartedBy(), that.getStartedBy()) - && Objects.equal(getStoppedBy(), that.getStoppedBy()) - && Objects.equal(getError(), that.getError()) - && Objects.equal(getState(), that.getState()) - && Objects.equal(getDetectorId(), that.getDetectorId()) - && Objects.equal(getTaskProgress(), that.getTaskProgress()) - && Objects.equal(getInitProgress(), that.getInitProgress()) - && Objects.equal(getCurrentPiece(), that.getCurrentPiece()) - && Objects.equal(getExecutionStartTime(), that.getExecutionStartTime()) - && Objects.equal(getExecutionEndTime(), that.getExecutionEndTime()) - && Objects.equal(getLatest(), that.getLatest()) - && Objects.equal(getTaskType(), that.getTaskType()) - && Objects.equal(getCheckpointId(), that.getCheckpointId()) - && Objects.equal(getCoordinatingNode(), that.getCoordinatingNode()) - && Objects.equal(getWorkerNode(), that.getWorkerNode()) + ADTask that = (ADTask) other; + return super.equals(that) && Objects.equal(getDetector(), that.getDetector()) - && Objects.equal(getDetectionDateRange(), that.getDetectionDateRange()) - && Objects.equal(getEntity(), that.getEntity()) - && Objects.equal(getParentTaskId(), that.getParentTaskId()) - && Objects.equal(getEstimatedMinutesLeft(), that.getEstimatedMinutesLeft()) - && Objects.equal(getUser(), that.getUser()); + && Objects.equal(getDetectionDateRange(), that.getDetectionDateRange()); } @Generated @Override public int hashCode() { - return Objects - .hashCode( - taskId, - lastUpdateTime, - startedBy, - stoppedBy, - error, - state, - detectorId, - taskProgress, - initProgress, - currentPiece, - executionStartTime, - executionEndTime, - isLatest, - taskType, - checkpointId, - coordinatingNode, - workerNode, - detector, - detectionDateRange, - entity, - parentTaskId, - estimatedMinutesLeft, - user - ); - } - - public String getTaskId() { - return taskId; - } - - public void setTaskId(String taskId) { - this.taskId = taskId; - } - - public Instant getLastUpdateTime() { - return lastUpdateTime; - } - - public String getStartedBy() { - return startedBy; - } - - public String getStoppedBy() { - return stoppedBy; - } - - public String getError() { - return error; - } - - public void setError(String error) { - this.error = error; - } - - public String getState() { - return state; - } - - public void setState(String state) { - this.state = state; - } - - public String getDetectorId() { - return detectorId; - } - - public Float getTaskProgress() { - return taskProgress; - } - - public Float getInitProgress() { - return initProgress; - } - - public Instant getCurrentPiece() { - return currentPiece; 
- } - - public Instant getExecutionStartTime() { - return executionStartTime; - } - - public Instant getExecutionEndTime() { - return executionEndTime; - } - - public Boolean getLatest() { - return isLatest; - } - - public String getTaskType() { - return taskType; - } - - public String getCheckpointId() { - return checkpointId; + int superHashCode = super.hashCode(); + int hash = Objects.hashCode(configId, detector, detectionDateRange); + hash += 89 * superHashCode; + return hash; } public AnomalyDetector getDetector() { return detector; } - public String getCoordinatingNode() { - return coordinatingNode; - } - - public String getWorkerNode() { - return workerNode; - } - - public DetectionDateRange getDetectionDateRange() { + public DateRange getDetectionDateRange() { return detectionDateRange; } - public Entity getEntity() { - return entity; - } - - public String getEntityModelId() { - return entity == null ? null : entity.getModelId(getDetectorId()).orElse(null); - } - - public String getParentTaskId() { - return parentTaskId; - } - - public Integer getEstimatedMinutesLeft() { - return estimatedMinutesLeft; - } - - public User getUser() { - return user; - } - - public void setDetectionDateRange(DetectionDateRange detectionDateRange) { + public void setDetectionDateRange(DateRange detectionDateRange) { this.detectionDateRange = detectionDateRange; } - - public void setLatest(Boolean latest) { - isLatest = latest; - } - - public void setLastUpdateTime(Instant lastUpdateTime) { - this.lastUpdateTime = lastUpdateTime; - } } diff --git a/src/main/java/org/opensearch/ad/model/ADTaskProfile.java b/src/main/java/org/opensearch/ad/model/ADTaskProfile.java index ff3cd1116..cd6eaeaa0 100644 --- a/src/main/java/org/opensearch/ad/model/ADTaskProfile.java +++ b/src/main/java/org/opensearch/ad/model/ADTaskProfile.java @@ -19,14 +19,13 @@ import java.util.Objects; import org.opensearch.Version; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.cluster.ADVersionUtil; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; /** * One anomaly detection task means one detector starts to run until stopped. 
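For orientation, the ADTask changes above collapse the task-generic fields and builder setters into the shared TimeSeriesTask base class, leaving only the AD-specific detector and detectionDateRange members here. A minimal usage sketch, assuming the inherited TimeSeriesTask.Builder keeps the setter names the parse(...) builder chain above relies on; the id and state strings and the DateRange(start, end) constructor are illustrative assumptions, not part of this patch:

    // Hypothetical construction of a historical detector-level task,
    // assuming an AnomalyDetector named "detector" is in scope.
    Instant now = Instant.now();
    ADTask task = ADTask.builder()
        .taskId("task-1")                                   // inherited from TimeSeriesTask.Builder
        .configId("detector-1")                             // replaces the removed detectorId(...) setter
        .taskType(ADTaskType.HISTORICAL_HC_DETECTOR.name())
        .state("INIT")                                      // assumed state string
        .detector(detector)                                 // AD-specific setter kept on ADTask.Builder
        .detectionDateRange(new DateRange(now.minusSeconds(3600), now)) // assumed DateRange(start, end) ctor
        .build();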
@@ -181,7 +180,7 @@ public void writeTo(StreamOutput out, Version adVersion) throws IOException { out.writeOptionalInt(thresholdModelTrainingDataSize); out.writeOptionalLong(modelSizeInBytes); out.writeOptionalString(nodeId); - if (ADVersionUtil.compatibleWithVersionOnOrAfter1_1(adVersion)) { + if (adVersion != null) { out.writeOptionalString(taskId); out.writeOptionalString(adTaskType); out.writeOptionalInt(detectorTaskSlots); diff --git a/src/main/java/org/opensearch/ad/model/ADTaskType.java b/src/main/java/org/opensearch/ad/model/ADTaskType.java index b4e06aefc..d235bad7e 100644 --- a/src/main/java/org/opensearch/ad/model/ADTaskType.java +++ b/src/main/java/org/opensearch/ad/model/ADTaskType.java @@ -12,11 +12,12 @@ package org.opensearch.ad.model; import java.util.List; -import java.util.stream.Collectors; + +import org.opensearch.timeseries.model.TaskType; import com.google.common.collect.ImmutableList; -public enum ADTaskType { +public enum ADTaskType implements TaskType { @Deprecated HISTORICAL, REALTIME_SINGLE_ENTITY, @@ -41,8 +42,4 @@ public enum ADTaskType { ADTaskType.HISTORICAL_HC_DETECTOR, ADTaskType.HISTORICAL ); - - public static List taskTypeToString(List adTaskTypes) { - return adTaskTypes.stream().map(type -> type.name()).collect(Collectors.toList()); - } } diff --git a/src/main/java/org/opensearch/ad/model/AnomalyDetector.java b/src/main/java/org/opensearch/ad/model/AnomalyDetector.java index 679647b3c..aa86fa842 100644 --- a/src/main/java/org/opensearch/ad/model/AnomalyDetector.java +++ b/src/main/java/org/opensearch/ad/model/AnomalyDetector.java @@ -11,52 +11,46 @@ package org.opensearch.ad.model; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_CHAR_IN_RESULT_INDEX_NAME; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_RESULT_INDEX_NAME_SIZE; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_RESULT_INDEX_PREFIX; -import static org.opensearch.ad.constant.CommonName.CUSTOM_RESULT_INDEX_PREFIX; +import static org.opensearch.ad.constant.ADCommonName.CUSTOM_RESULT_INDEX_PREFIX; import static org.opensearch.ad.model.AnomalyDetectorType.MULTI_ENTITY; import static org.opensearch.ad.model.AnomalyDetectorType.SINGLE_ENTITY; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import static org.opensearch.index.query.AbstractQueryBuilder.parseInnerQueryBuilder; import java.io.IOException; -import java.time.Duration; import java.time.Instant; import java.time.temporal.ChronoUnit; import java.util.ArrayList; import java.util.List; import java.util.Map; -import java.util.stream.Collectors; - -import org.apache.logging.log4j.util.Strings; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.common.exception.ADValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.constant.CommonValue; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.NumericSetting; -import org.opensearch.ad.util.ParseUtils; + +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.settings.ADNumericSetting; import org.opensearch.common.unit.TimeValue; import org.opensearch.commons.authuser.User; import org.opensearch.core.ParseField; import org.opensearch.core.common.ParsingException; import org.opensearch.core.common.io.stream.StreamInput; import 
org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.ToXContent; -import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParseException; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.index.query.QueryBuilder; import org.opensearch.index.query.QueryBuilders; - -import com.google.common.base.Objects; -import com.google.common.collect.ImmutableList; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonValue; +import org.opensearch.timeseries.dataprocessor.ImputationOption; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.TimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; +import org.opensearch.timeseries.util.ParseUtils; /** * An AnomalyDetector is used to represent anomaly detection model(RCF) related parameters. @@ -64,7 +58,7 @@ * TODO: Will replace detector config mapping in AD task with detector config setting directly \ * in code rather than config it in anomaly-detection-state.json file. */ -public class AnomalyDetector implements Writeable, ToXContentObject { +public class AnomalyDetector extends Config { public static final String PARSE_FIELD_NAME = "AnomalyDetector"; public static final NamedXContentRegistry.Entry XCONTENT_REGISTRY = new NamedXContentRegistry.Entry( @@ -72,59 +66,23 @@ public class AnomalyDetector implements Writeable, ToXContentObject { new ParseField(PARSE_FIELD_NAME), it -> parse(it) ); - public static final String NO_ID = ""; - public static final String ANOMALY_DETECTORS_INDEX = ".opendistro-anomaly-detectors"; public static final String TYPE = "_doc"; - public static final String QUERY_PARAM_PERIOD_START = "period_start"; - public static final String QUERY_PARAM_PERIOD_END = "period_end"; - public static final String GENERAL_SETTINGS = "general_settings"; - - public static final String NAME_FIELD = "name"; - private static final String DESCRIPTION_FIELD = "description"; - public static final String TIMEFIELD_FIELD = "time_field"; - public static final String INDICES_FIELD = "indices"; - public static final String FILTER_QUERY_FIELD = "filter_query"; - public static final String FEATURE_ATTRIBUTES_FIELD = "feature_attributes"; + // for bwc, we have to keep this field instead of reusing an interval field in the super class. + // otherwise, we won't be able to recognize "detection_interval" field sent from old implementation. 
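The comment above is the key backward-compatibility contract in this file: the value now lives in the inherited Config interval field, but the JSON field name stays "detection_interval" so configs written by older releases keep parsing. A hedged sketch of the legacy payload this preserves (field layout assumed from the usual AD interval shape; all values illustrative):

    // A pre-refactor document still round-trips: parse(...) maps
    // "detection_interval" onto the inherited interval field, and
    // toXContent(...) (later in this diff) keeps writing
    // DETECTION_INTERVAL_FIELD under the old name.
    String legacyDetectorJson = "{"
        + "\"name\": \"cpu-detector\","
        + "\"time_field\": \"timestamp\","
        + "\"indices\": [\"server-metrics\"],"
        + "\"detection_interval\": {\"period\": {\"interval\": 10, \"unit\": \"Minutes\"}}"
        + "}";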
public static final String DETECTION_INTERVAL_FIELD = "detection_interval"; - public static final String WINDOW_DELAY_FIELD = "window_delay"; - public static final String SHINGLE_SIZE_FIELD = "shingle_size"; - private static final String LAST_UPDATE_TIME_FIELD = "last_update_time"; - public static final String UI_METADATA_FIELD = "ui_metadata"; - public static final String CATEGORY_FIELD = "category_field"; - public static final String USER_FIELD = "user"; public static final String DETECTOR_TYPE_FIELD = "detector_type"; - public static final String RESULT_INDEX_FIELD = "result_index"; - public static final String AGGREGATION = "aggregation_issue"; - public static final String TIMEOUT = "timeout"; @Deprecated public static final String DETECTION_DATE_RANGE_FIELD = "detection_date_range"; - private final String detectorId; - private final Long version; - private final String name; - private final String description; - private final String timeField; - private final List indices; - private final List featureAttributes; - private final QueryBuilder filterQuery; - private final TimeConfiguration detectionInterval; - private final TimeConfiguration windowDelay; - private final Integer shingleSize; - private final Map uiMetadata; - private final Integer schemaVersion; - private final Instant lastUpdateTime; - private final List categoryFields; - private User user; - private String detectorType; - private String resultIndex; + protected String detectorType; // TODO: support backward compatibility, will remove in future @Deprecated - private DetectionDateRange detectionDateRange; + private DateRange detectionDateRange; - public static final int MAX_RESULT_INDEX_NAME_SIZE = 255; - // OS doesn’t allow uppercase: https://tinyurl.com/yse2xdbx - public static final String RESULT_INDEX_NAME_PATTERN = "[a-z0-9_-]+"; + public static String INVALID_RESULT_INDEX_NAME_SIZE = "Result index name must contain less than " + + MAX_RESULT_INDEX_NAME_SIZE + + " characters"; /** * Constructor function.
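The constructor hunk below also changes the validation style: instead of throwing an ADValidationException at the first failed check, the new code records errorMessage/issueType and defers to checkAndThrowValidationErrors(...), inherited from Config, which raises a single ValidationException. A hedged sketch of what a caller observes (the helper and the getType() accessor are assumptions for illustration):

    try {
        AnomalyDetector detector = buildDetectorWithNullInterval(); // hypothetical helper
    } catch (ValidationException e) {
        // Expected per the hunk below: ADCommonMessages.NULL_DETECTION_INTERVAL
        // reported as ValidationIssueType.DETECTION_INTERVAL.
        assert e.getType() == ValidationIssueType.DETECTION_INTERVAL; // assumed accessor
    }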
@@ -146,6 +104,7 @@ public class AnomalyDetector implements Writeable, ToXContentObject { * @param categoryFields a list of partition fields * @param user user to which detector is associated * @param resultIndex result index + * @param imputationOption interpolation method and optional default values */ public AnomalyDetector( String detectorId, @@ -164,103 +123,58 @@ public AnomalyDetector( Instant lastUpdateTime, List categoryFields, User user, - String resultIndex + String resultIndex, + ImputationOption imputationOption ) { - if (Strings.isBlank(name)) { - throw new ADValidationException( - CommonErrorMessages.EMPTY_DETECTOR_NAME, - DetectorValidationIssueType.NAME, - ValidationAspect.DETECTOR - ); - } - if (Strings.isBlank(timeField)) { - throw new ADValidationException( - CommonErrorMessages.NULL_TIME_FIELD, - DetectorValidationIssueType.TIMEFIELD_FIELD, - ValidationAspect.DETECTOR - ); - } - if (indices == null || indices.isEmpty()) { - throw new ADValidationException( - CommonErrorMessages.EMPTY_INDICES, - DetectorValidationIssueType.INDICES, - ValidationAspect.DETECTOR - ); - } + super( + detectorId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + detectionInterval, + imputationOption + ); + + checkAndThrowValidationErrors(ValidationAspect.DETECTOR); + if (detectionInterval == null) { - throw new ADValidationException( - CommonErrorMessages.NULL_DETECTION_INTERVAL, - DetectorValidationIssueType.DETECTION_INTERVAL, - ValidationAspect.DETECTOR - ); - } - if (invalidShingleSizeRange(shingleSize)) { - throw new ADValidationException( - "Shingle size must be a positive integer no larger than " - + AnomalyDetectorSettings.MAX_SHINGLE_SIZE - + ". Got " - + shingleSize, - DetectorValidationIssueType.SHINGLE_SIZE_FIELD, - ValidationAspect.DETECTOR - ); + errorMessage = ADCommonMessages.NULL_DETECTION_INTERVAL; + issueType = ValidationIssueType.DETECTION_INTERVAL; + } else if (((IntervalTimeConfiguration) detectionInterval).getInterval() <= 0) { + errorMessage = ADCommonMessages.INVALID_DETECTION_INTERVAL; + issueType = ValidationIssueType.DETECTION_INTERVAL; } - int maxCategoryFields = NumericSetting.maxCategoricalFields(); + + int maxCategoryFields = ADNumericSetting.maxCategoricalFields(); if (categoryFields != null && categoryFields.size() > maxCategoryFields) { - throw new ADValidationException( - CommonErrorMessages.getTooManyCategoricalFieldErr(maxCategoryFields), - DetectorValidationIssueType.CATEGORY, - ValidationAspect.DETECTOR - ); - } - if (((IntervalTimeConfiguration) detectionInterval).getInterval() <= 0) { - throw new ADValidationException( - CommonErrorMessages.INVALID_DETECTION_INTERVAL, - DetectorValidationIssueType.DETECTION_INTERVAL, - ValidationAspect.DETECTOR - ); + errorMessage = CommonMessages.getTooManyCategoricalFieldErr(maxCategoryFields); + issueType = ValidationIssueType.CATEGORY; } - this.detectorId = detectorId; - this.version = version; - this.name = name; - this.description = description; - this.timeField = timeField; - this.indices = indices; - this.featureAttributes = features == null ? 
ImmutableList.of() : ImmutableList.copyOf(features); - this.filterQuery = filterQuery; - this.detectionInterval = detectionInterval; - this.windowDelay = windowDelay; - this.shingleSize = getShingleSize(shingleSize); - this.uiMetadata = uiMetadata; - this.schemaVersion = schemaVersion; - this.lastUpdateTime = lastUpdateTime; - this.categoryFields = categoryFields; - this.user = user; - this.detectorType = isMultientityDetector(categoryFields) ? MULTI_ENTITY.name() : SINGLE_ENTITY.name(); - this.resultIndex = Strings.trimToNull(resultIndex); - String errorMessage = validateResultIndex(this.resultIndex); - if (errorMessage != null) { - throw new ADValidationException(errorMessage, DetectorValidationIssueType.RESULT_INDEX, ValidationAspect.DETECTOR); - } - } - public static String validateResultIndex(String resultIndex) { - if (resultIndex == null) { - return null; - } - if (!resultIndex.startsWith(CUSTOM_RESULT_INDEX_PREFIX)) { - return INVALID_RESULT_INDEX_PREFIX; - } - if (resultIndex.length() > MAX_RESULT_INDEX_NAME_SIZE) { - return INVALID_RESULT_INDEX_NAME_SIZE; - } - if (!resultIndex.matches(RESULT_INDEX_NAME_PATTERN)) { - return INVALID_CHAR_IN_RESULT_INDEX_NAME; - } - return null; + checkAndThrowValidationErrors(ValidationAspect.DETECTOR); + + this.detectorType = isHC(categoryFields) ? MULTI_ENTITY.name() : SINGLE_ENTITY.name(); } + /* + * For backward compatibility reasons, we cannot use super class + * Config's constructor as we have detectionDateRange and + * detectorType that Config does not have. + */ public AnomalyDetector(StreamInput input) throws IOException { - detectorId = input.readOptionalString(); + id = input.readOptionalString(); version = input.readOptionalLong(); name = input.readString(); description = input.readOptionalString(); @@ -268,7 +182,7 @@ public AnomalyDetector(StreamInput input) throws IOException { indices = input.readStringList(); featureAttributes = input.readList(Feature::new); filterQuery = input.readNamedWriteable(QueryBuilder.class); - detectionInterval = IntervalTimeConfiguration.readFrom(input); + interval = IntervalTimeConfiguration.readFrom(input); windowDelay = IntervalTimeConfiguration.readFrom(input); shingleSize = input.readInt(); schemaVersion = input.readInt(); @@ -280,7 +194,7 @@ public AnomalyDetector(StreamInput input) throws IOException { user = null; } if (input.readBoolean()) { - detectionDateRange = new DetectionDateRange(input); + detectionDateRange = new DateRange(input); } else { detectionDateRange = null; } @@ -290,16 +204,27 @@ } else { this.uiMetadata = null; } - resultIndex = input.readOptionalString(); + customResultIndex = input.readOptionalString(); + if (input.readBoolean()) { + this.imputationOption = new ImputationOption(input); + } else { + this.imputationOption = null; + } + this.imputer = createImputer(); } public XContentBuilder toXContent(XContentBuilder builder) throws IOException { return toXContent(builder, ToXContent.EMPTY_PARAMS); } + /* + * For backward compatibility reasons, we cannot use super class + * Config's writeTo as we have detectionDateRange and + * detectorType that Config does not have.
+ */ @Override public void writeTo(StreamOutput output) throws IOException { - output.writeOptionalString(detectorId); + output.writeOptionalString(id); output.writeOptionalLong(version); output.writeString(name); output.writeOptionalString(description); @@ -307,7 +232,7 @@ public void writeTo(StreamOutput output) throws IOException { output.writeStringCollection(indices); output.writeList(featureAttributes); output.writeNamedWriteable(filterQuery); - detectionInterval.writeTo(output); + interval.writeTo(output); windowDelay.writeTo(output); output.writeInt(shingleSize); output.writeInt(schemaVersion); @@ -332,45 +257,28 @@ public void writeTo(StreamOutput output) throws IOException { } else { output.writeBoolean(false); } - output.writeOptionalString(resultIndex); + output.writeOptionalString(customResultIndex); + if (imputationOption != null) { + output.writeBoolean(true); + imputationOption.writeTo(output); + } else { + output.writeBoolean(false); + } } @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - XContentBuilder xContentBuilder = builder - .startObject() - .field(NAME_FIELD, name) - .field(DESCRIPTION_FIELD, description) - .field(TIMEFIELD_FIELD, timeField) - .field(INDICES_FIELD, indices.toArray()) - .field(FILTER_QUERY_FIELD, filterQuery) - .field(DETECTION_INTERVAL_FIELD, detectionInterval) - .field(WINDOW_DELAY_FIELD, windowDelay) - .field(SHINGLE_SIZE_FIELD, shingleSize) - .field(CommonName.SCHEMA_VERSION_FIELD, schemaVersion) - .field(FEATURE_ATTRIBUTES_FIELD, featureAttributes.toArray()); - - if (uiMetadata != null && !uiMetadata.isEmpty()) { - xContentBuilder.field(UI_METADATA_FIELD, uiMetadata); - } - if (lastUpdateTime != null) { - xContentBuilder.field(LAST_UPDATE_TIME_FIELD, lastUpdateTime.toEpochMilli()); - } - if (categoryFields != null) { - xContentBuilder.field(CATEGORY_FIELD, categoryFields.toArray()); - } - if (user != null) { - xContentBuilder.field(USER_FIELD, user); - } + XContentBuilder xContentBuilder = builder.startObject(); + xContentBuilder = super.toXContent(xContentBuilder, params); + xContentBuilder.field(DETECTION_INTERVAL_FIELD, interval); + if (detectorType != null) { xContentBuilder.field(DETECTOR_TYPE_FIELD, detectorType); } if (detectionDateRange != null) { xContentBuilder.field(DETECTION_DATE_RANGE_FIELD, detectionDateRange); } - if (resultIndex != null) { - xContentBuilder.field(RESULT_INDEX_FIELD, resultIndex); - } + return xContentBuilder.endObject(); } @@ -437,10 +345,11 @@ public static AnomalyDetector parse( Map uiMetadata = null; Instant lastUpdateTime = null; User user = null; - DetectionDateRange detectionDateRange = null; + DateRange detectionDateRange = null; String resultIndex = null; List categoryField = null; + ImputationOption imputationOption = null; ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser); while (parser.nextToken() != XContentParser.Token.END_OBJECT) { @@ -466,7 +375,7 @@ public static AnomalyDetector parse( case UI_METADATA_FIELD: uiMetadata = parser.map(); break; - case CommonName.SCHEMA_VERSION_FIELD: + case org.opensearch.timeseries.constant.CommonName.SCHEMA_VERSION_FIELD: schemaVersion = parser.intValue(); break; case FILTER_QUERY_FIELD: @@ -474,9 +383,9 @@ public static AnomalyDetector parse( try { filterQuery = parseInnerQueryBuilder(parser); } catch (ParsingException | XContentParseException e) { - throw new ADValidationException( + throw new ValidationException( "Custom query error in data filter: " + e.getMessage(), 
- DetectorValidationIssueType.FILTER_QUERY, + ValidationIssueType.FILTER_QUERY, ValidationAspect.DETECTOR ); } catch (IllegalArgumentException e) { @@ -489,11 +398,10 @@ public static AnomalyDetector parse( try { detectionInterval = TimeConfiguration.parse(parser); } catch (Exception e) { - if (e instanceof IllegalArgumentException - && e.getMessage().contains(CommonErrorMessages.NEGATIVE_TIME_CONFIGURATION)) { - throw new ADValidationException( + if (e instanceof IllegalArgumentException && e.getMessage().contains(CommonMessages.NEGATIVE_TIME_CONFIGURATION)) { + throw new ValidationException( "Detection interval must be a positive integer", - DetectorValidationIssueType.DETECTION_INTERVAL, + ValidationIssueType.DETECTION_INTERVAL, ValidationAspect.DETECTOR ); } @@ -508,9 +416,9 @@ public static AnomalyDetector parse( } } catch (Exception e) { if (e instanceof ParsingException || e instanceof XContentParseException) { - throw new ADValidationException( + throw new ValidationException( "Custom query error: " + e.getMessage(), - DetectorValidationIssueType.FEATURE_ATTRIBUTES, + ValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.DETECTOR ); } @@ -521,11 +429,10 @@ public static AnomalyDetector parse( try { windowDelay = TimeConfiguration.parse(parser); } catch (Exception e) { - if (e instanceof IllegalArgumentException - && e.getMessage().contains(CommonErrorMessages.NEGATIVE_TIME_CONFIGURATION)) { - throw new ADValidationException( + if (e instanceof IllegalArgumentException && e.getMessage().contains(CommonMessages.NEGATIVE_TIME_CONFIGURATION)) { + throw new ValidationException( "Window delay interval must be a positive integer", - DetectorValidationIssueType.WINDOW_DELAY, + ValidationIssueType.WINDOW_DELAY, ValidationAspect.DETECTOR ); } @@ -545,11 +452,14 @@ public static AnomalyDetector parse( user = User.parse(parser); break; case DETECTION_DATE_RANGE_FIELD: - detectionDateRange = DetectionDateRange.parse(parser); + detectionDateRange = DateRange.parse(parser); break; case RESULT_INDEX_FIELD: resultIndex = parser.text(); break; + case IMPUTATION_OPTION_FIELD: + imputationOption = ImputationOption.parse(parser); + break; default: parser.skipChildren(); break; @@ -572,197 +482,35 @@ public static AnomalyDetector parse( lastUpdateTime, categoryField, user, - resultIndex + resultIndex, + imputationOption ); detector.setDetectionDateRange(detectionDateRange); return detector; } - @Generated - @Override - public boolean equals(Object o) { - if (this == o) - return true; - if (o == null || getClass() != o.getClass()) - return false; - AnomalyDetector detector = (AnomalyDetector) o; - return Objects.equal(getName(), detector.getName()) - && Objects.equal(getDescription(), detector.getDescription()) - && Objects.equal(getTimeField(), detector.getTimeField()) - && Objects.equal(getIndices(), detector.getIndices()) - && Objects.equal(getFeatureAttributes(), detector.getFeatureAttributes()) - && Objects.equal(getFilterQuery(), detector.getFilterQuery()) - && Objects.equal(getDetectionInterval(), detector.getDetectionInterval()) - && Objects.equal(getWindowDelay(), detector.getWindowDelay()) - && Objects.equal(getShingleSize(), detector.getShingleSize()) - && Objects.equal(getCategoryField(), detector.getCategoryField()) - && Objects.equal(getUser(), detector.getUser()) - && Objects.equal(getResultIndex(), detector.getResultIndex()); - } - - @Generated - @Override - public int hashCode() { - return Objects - .hashCode( - detectorId, - name, - description, - timeField, - indices, - 
featureAttributes, - detectionInterval, - windowDelay, - shingleSize, - uiMetadata, - schemaVersion, - lastUpdateTime, - user, - detectorType, - resultIndex - ); - } - - public String getDetectorId() { - return detectorId; - } - - public Long getVersion() { - return version; - } - - public String getName() { - return name; - } - - public String getDescription() { - return description; - } - - public String getTimeField() { - return timeField; - } - - public List getIndices() { - return indices; - } - - public List getFeatureAttributes() { - return featureAttributes; - } - - public QueryBuilder getFilterQuery() { - return filterQuery; - } - - /** - * Returns enabled feature ids in the same order in feature attributes. - * - * @return a list of filtered feature ids. - */ - public List getEnabledFeatureIds() { - return featureAttributes.stream().filter(Feature::getEnabled).map(Feature::getId).collect(Collectors.toList()); - } - - public List getEnabledFeatureNames() { - return featureAttributes.stream().filter(Feature::getEnabled).map(Feature::getName).collect(Collectors.toList()); - } - - public TimeConfiguration getDetectionInterval() { - return detectionInterval; - } - - public TimeConfiguration getWindowDelay() { - return windowDelay; - } - - public Integer getShingleSize() { - return shingleSize; - } - - /** - * If the given shingle size is null, return default based on the kind of detector; - * otherwise, return the given shingle size. - * - * TODO: need to deal with the case where customers start with single-entity detector, we set it to 8 by default; - * then cx update it to multi-entity detector, we would still use 8 in this case. Kibana needs to change to - * give the correct shingle size. - * @param customShingleSize Given shingle size - * @return Shingle size - */ - private static Integer getShingleSize(Integer customShingleSize) { - return customShingleSize == null ? 
DEFAULT_SHINGLE_SIZE : customShingleSize; - } - - public Map getUiMetadata() { - return uiMetadata; - } - - public Integer getSchemaVersion() { - return schemaVersion; - } - - public Instant getLastUpdateTime() { - return lastUpdateTime; - } - - public List getCategoryField() { - return this.categoryFields; - } - - public long getDetectorIntervalInMilliseconds() { - return ((IntervalTimeConfiguration) getDetectionInterval()).toDuration().toMillis(); - } - - public long getDetectorIntervalInSeconds() { - return getDetectorIntervalInMilliseconds() / 1000; - } - - public long getDetectorIntervalInMinutes() { - return getDetectorIntervalInMilliseconds() / 1000 / 60; - } - - public Duration getDetectionIntervalDuration() { - return ((IntervalTimeConfiguration) getDetectionInterval()).toDuration(); - } - - public User getUser() { - return user; - } - - public void setUser(User user) { - this.user = user; - } - public String getDetectorType() { return detectorType; } - public void setDetectionDateRange(DetectionDateRange detectionDateRange) { + public void setDetectionDateRange(DateRange detectionDateRange) { this.detectionDateRange = detectionDateRange; } - public DetectionDateRange getDetectionDateRange() { + public DateRange getDetectionDateRange() { return detectionDateRange; } - public String getResultIndex() { - return resultIndex; - } - - public boolean isMultientityDetector() { - return AnomalyDetector.isMultientityDetector(getCategoryField()); - } - - public boolean isMultiCategoryDetector() { - return categoryFields != null && categoryFields.size() > 1; - } - - private static boolean isMultientityDetector(List categoryFields) { - return categoryFields != null && categoryFields.size() > 0; + @Override + protected ValidationAspect getConfigValidationAspect() { + return ValidationAspect.DETECTOR; } - public boolean invalidShingleSizeRange(Integer shingleSizeToTest) { - return shingleSizeToTest != null && (shingleSizeToTest < 1 || shingleSizeToTest > AnomalyDetectorSettings.MAX_SHINGLE_SIZE); + @Override + public String validateCustomResultIndex(String resultIndex) { + if (resultIndex != null && !resultIndex.startsWith(CUSTOM_RESULT_INDEX_PREFIX)) { + return ADCommonMessages.INVALID_RESULT_INDEX_PREFIX; + } + return super.validateCustomResultIndex(resultIndex); } } diff --git a/src/main/java/org/opensearch/ad/model/AnomalyDetectorExecutionInput.java b/src/main/java/org/opensearch/ad/model/AnomalyDetectorExecutionInput.java index 4a36c741f..b2a45c9bb 100644 --- a/src/main/java/org/opensearch/ad/model/AnomalyDetectorExecutionInput.java +++ b/src/main/java/org/opensearch/ad/model/AnomalyDetectorExecutionInput.java @@ -16,12 +16,12 @@ import java.io.IOException; import java.time.Instant; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.core.common.Strings; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; @@ -83,7 +83,6 @@ public static AnomalyDetectorExecutionInput parse(XContentParser parser, String periodEnd = ParseUtils.toInstant(parser); break; case DETECTOR_FIELD: - XContentParser.Token token = parser.currentToken(); if (parser.currentToken().equals(XContentParser.Token.START_OBJECT)) { detector = AnomalyDetector.parse(parser, detectorId); } diff --git 
a/src/main/java/org/opensearch/ad/model/AnomalyResult.java b/src/main/java/org/opensearch/ad/model/AnomalyResult.java index acf90fe26..4ee4e0ee7 100644 --- a/src/main/java/org/opensearch/ad/model/AnomalyResult.java +++ b/src/main/java/org/opensearch/ad/model/AnomalyResult.java @@ -11,40 +11,43 @@ package org.opensearch.ad.model; -import static org.opensearch.ad.constant.CommonName.DUMMY_DETECTOR_ID; -import static org.opensearch.ad.constant.CommonName.SCHEMA_VERSION_FIELD; +import static org.opensearch.ad.constant.ADCommonName.DUMMY_DETECTOR_ID; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import java.io.IOException; import java.time.Instant; import java.util.ArrayList; import java.util.List; +import java.util.Optional; import org.apache.commons.lang.builder.ToStringBuilder; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.message.ParameterizedMessage; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.constant.CommonValue; import org.opensearch.ad.ml.ThresholdingResult; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.commons.authuser.User; import org.opensearch.core.ParseField; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.NamedXContentRegistry; -import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.constant.CommonValue; +import org.opensearch.timeseries.model.DataByFeatureId; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IndexableResult; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; /** * Include result returned from RCF model and feature data. 
*/ -public class AnomalyResult implements ToXContentObject, Writeable { +public class AnomalyResult extends IndexableResult { private static final Logger LOG = LogManager.getLogger(ThresholdingResult.class); public static final String PARSE_FIELD_NAME = "AnomalyResult"; public static final NamedXContentRegistry.Entry XCONTENT_REGISTRY = new NamedXContentRegistry.Entry( @@ -56,17 +59,6 @@ public class AnomalyResult implements ToXContentObject, Writeable { public static final String DETECTOR_ID_FIELD = "detector_id"; public static final String ANOMALY_SCORE_FIELD = "anomaly_score"; public static final String ANOMALY_GRADE_FIELD = "anomaly_grade"; - public static final String CONFIDENCE_FIELD = "confidence"; - public static final String FEATURE_DATA_FIELD = "feature_data"; - public static final String DATA_START_TIME_FIELD = "data_start_time"; - public static final String DATA_END_TIME_FIELD = "data_end_time"; - public static final String EXECUTION_START_TIME_FIELD = "execution_start_time"; - public static final String EXECUTION_END_TIME_FIELD = "execution_end_time"; - public static final String ERROR_FIELD = "error"; - public static final String ENTITY_FIELD = "entity"; - public static final String USER_FIELD = "user"; - public static final String TASK_ID_FIELD = "task_id"; - public static final String MODEL_ID_FIELD = "model_id"; public static final String APPROX_ANOMALY_START_FIELD = "approx_anomaly_start_time"; public static final String RELEVANT_ATTRIBUTION_FIELD = "relevant_attribution"; public static final String PAST_VALUES_FIELD = "past_values"; @@ -75,31 +67,8 @@ public class AnomalyResult implements ToXContentObject, Writeable { // unused currently. added since odfe 1.4 public static final String IS_ANOMALY_FIELD = "is_anomaly"; - private final String detectorId; - private final String taskId; private final Double anomalyScore; private final Double anomalyGrade; - private final Double confidence; - private final List featureData; - private final Instant dataStartTime; - private final Instant dataEndTime; - private final Instant executionStartTime; - private final Instant executionEndTime; - private final String error; - private final Entity entity; - private User user; - private final Integer schemaVersion; - /* - * model id for easy aggregations of entities. The front end needs to query - * for entities ordered by the descending order of anomaly grades and the - * number of anomalies. After supporting multi-category fields, it is hard - * to write such queries since the entity information is stored in a nested - * object array. Also, the front end has all code/queries/ helper functions - * in place to rely on a single key per entity combo. This PR adds model id - * to anomaly result to help the transition to multi-categorical field less - * painful. - */ - private final String modelId; /** * the approximate time of current anomaly. We might detect anomaly late. 
This field @@ -212,6 +181,7 @@ So if we detect anomaly late, we get the baseDimension values from the past (cur // rcf score threshold at the time of writing a result private final Double threshold; + protected final Double confidence; // used when indexing exception or error or an empty result public AnomalyResult( @@ -223,7 +193,7 @@ public AnomalyResult( Instant executionStartTime, Instant executionEndTime, String error, - Entity entity, + Optional entity, User user, Integer schemaVersion, String modelId @@ -253,7 +223,7 @@ public AnomalyResult( } public AnomalyResult( - String detectorId, + String configId, String taskId, Double anomalyScore, Double anomalyGrade, @@ -264,7 +234,7 @@ public AnomalyResult( Instant executionStartTime, Instant executionEndTime, String error, - Entity entity, + Optional entity, User user, Integer schemaVersion, String modelId, @@ -274,21 +244,23 @@ public AnomalyResult( List expectedValuesList, Double threshold ) { - this.detectorId = detectorId; - this.taskId = taskId; + super( + configId, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + taskId + ); + this.confidence = confidence; this.anomalyScore = anomalyScore; this.anomalyGrade = anomalyGrade; - this.confidence = confidence; - this.featureData = featureData; - this.dataStartTime = dataStartTime; - this.dataEndTime = dataEndTime; - this.executionStartTime = executionStartTime; - this.executionEndTime = executionEndTime; - this.error = error; - this.entity = entity; - this.user = user; - this.schemaVersion = schemaVersion; - this.modelId = modelId; this.approxAnomalyStartTime = approxAnomalyStartTime; this.relevantAttribution = relevantAttribution; this.pastValues = pastValues; @@ -335,7 +307,7 @@ public static AnomalyResult fromRawTRCFResult( Instant executionStartTime, Instant executionEndTime, String error, - Entity entity, + Optional entity, User user, Integer schemaVersion, String modelId, @@ -449,34 +421,10 @@ public static AnomalyResult fromRawTRCFResult( } public AnomalyResult(StreamInput input) throws IOException { - this.detectorId = input.readString(); + super(input); + this.confidence = input.readDouble(); this.anomalyScore = input.readDouble(); this.anomalyGrade = input.readDouble(); - this.confidence = input.readDouble(); - int featureSize = input.readVInt(); - this.featureData = new ArrayList<>(featureSize); - for (int i = 0; i < featureSize; i++) { - featureData.add(new FeatureData(input)); - } - this.dataStartTime = input.readInstant(); - this.dataEndTime = input.readInstant(); - this.executionStartTime = input.readInstant(); - this.executionEndTime = input.readInstant(); - this.error = input.readOptionalString(); - if (input.readBoolean()) { - this.entity = new Entity(input); - } else { - this.entity = null; - } - if (input.readBoolean()) { - this.user = new User(input); - } else { - user = null; - } - this.schemaVersion = input.readInt(); - this.taskId = input.readOptionalString(); - this.modelId = input.readOptionalString(); - // if anomaly is caused by current input, we don't show approximate time this.approxAnomalyStartTime = input.readOptionalInstant(); @@ -517,29 +465,29 @@ public AnomalyResult(StreamInput input) throws IOException { public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { XContentBuilder xContentBuilder = builder .startObject() - .field(DETECTOR_ID_FIELD, detectorId) - .field(SCHEMA_VERSION_FIELD, schemaVersion); + 
.field(DETECTOR_ID_FIELD, configId) + .field(CommonName.SCHEMA_VERSION_FIELD, schemaVersion); // In normal AD result, we always pass data start/end times. In custom result index, // we need to write/delete a dummy AD result to verify if user has write permission // to the custom result index. Just pass in null start/end time for this dummy anomaly // result to make sure it won't be queried by mistake. if (dataStartTime != null) { - xContentBuilder.field(DATA_START_TIME_FIELD, dataStartTime.toEpochMilli()); + xContentBuilder.field(CommonName.DATA_START_TIME_FIELD, dataStartTime.toEpochMilli()); } if (dataEndTime != null) { - xContentBuilder.field(DATA_END_TIME_FIELD, dataEndTime.toEpochMilli()); + xContentBuilder.field(CommonName.DATA_END_TIME_FIELD, dataEndTime.toEpochMilli()); } if (featureData != null) { // can be null during preview - xContentBuilder.field(FEATURE_DATA_FIELD, featureData.toArray()); + xContentBuilder.field(CommonName.FEATURE_DATA_FIELD, featureData.toArray()); } if (executionStartTime != null) { // can be null during preview - xContentBuilder.field(EXECUTION_START_TIME_FIELD, executionStartTime.toEpochMilli()); + xContentBuilder.field(CommonName.EXECUTION_START_TIME_FIELD, executionStartTime.toEpochMilli()); } if (executionEndTime != null) { // can be null during preview - xContentBuilder.field(EXECUTION_END_TIME_FIELD, executionEndTime.toEpochMilli()); + xContentBuilder.field(CommonName.EXECUTION_END_TIME_FIELD, executionEndTime.toEpochMilli()); } if (anomalyScore != null && !anomalyScore.isNaN()) { xContentBuilder.field(ANOMALY_SCORE_FIELD, anomalyScore); @@ -548,22 +496,22 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws xContentBuilder.field(ANOMALY_GRADE_FIELD, anomalyGrade); } if (confidence != null && !confidence.isNaN()) { - xContentBuilder.field(CONFIDENCE_FIELD, confidence); + xContentBuilder.field(CommonName.CONFIDENCE_FIELD, confidence); } if (error != null) { - xContentBuilder.field(ERROR_FIELD, error); + xContentBuilder.field(CommonName.ERROR_FIELD, error); } - if (entity != null) { - xContentBuilder.field(ENTITY_FIELD, entity); + if (optionalEntity.isPresent()) { + xContentBuilder.field(CommonName.ENTITY_FIELD, optionalEntity.get()); } if (user != null) { - xContentBuilder.field(USER_FIELD, user); + xContentBuilder.field(CommonName.USER_FIELD, user); } if (taskId != null) { - xContentBuilder.field(TASK_ID_FIELD, taskId); + xContentBuilder.field(CommonName.TASK_ID_FIELD, taskId); } if (modelId != null) { - xContentBuilder.field(MODEL_ID_FIELD, modelId); + xContentBuilder.field(CommonName.MODEL_ID_FIELD, modelId); } // output extra fields such as attribution and expected only when this is an anomaly @@ -626,43 +574,43 @@ public static AnomalyResult parse(XContentParser parser) throws IOException { case ANOMALY_GRADE_FIELD: anomalyGrade = parser.doubleValue(); break; - case CONFIDENCE_FIELD: + case CommonName.CONFIDENCE_FIELD: confidence = parser.doubleValue(); break; - case FEATURE_DATA_FIELD: + case CommonName.FEATURE_DATA_FIELD: ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); while (parser.nextToken() != XContentParser.Token.END_ARRAY) { featureData.add(FeatureData.parse(parser)); } break; - case DATA_START_TIME_FIELD: + case CommonName.DATA_START_TIME_FIELD: dataStartTime = ParseUtils.toInstant(parser); break; - case DATA_END_TIME_FIELD: + case CommonName.DATA_END_TIME_FIELD: dataEndTime = ParseUtils.toInstant(parser); break; - case EXECUTION_START_TIME_FIELD: + case 
CommonName.EXECUTION_START_TIME_FIELD: executionStartTime = ParseUtils.toInstant(parser); break; - case EXECUTION_END_TIME_FIELD: + case CommonName.EXECUTION_END_TIME_FIELD: executionEndTime = ParseUtils.toInstant(parser); break; - case ERROR_FIELD: + case CommonName.ERROR_FIELD: error = parser.text(); break; - case ENTITY_FIELD: + case CommonName.ENTITY_FIELD: entity = Entity.parse(parser); break; - case USER_FIELD: + case CommonName.USER_FIELD: user = User.parse(parser); break; - case SCHEMA_VERSION_FIELD: + case CommonName.SCHEMA_VERSION_FIELD: schemaVersion = parser.intValue(); break; - case TASK_ID_FIELD: + case CommonName.TASK_ID_FIELD: taskId = parser.text(); break; - case MODEL_ID_FIELD: + case CommonName.MODEL_ID_FIELD: modelId = parser.text(); break; case APPROX_ANOMALY_START_FIELD: @@ -707,7 +655,7 @@ public static AnomalyResult parse(XContentParser parser) throws IOException { executionStartTime, executionEndTime, error, - entity, + Optional.ofNullable(entity), user, schemaVersion, modelId, @@ -722,24 +670,14 @@ public static AnomalyResult parse(XContentParser parser) throws IOException { @Generated @Override public boolean equals(Object o) { - if (this == o) - return true; - if (o == null || getClass() != o.getClass()) + if (!super.equals(o)) + return false; + if (getClass() != o.getClass()) return false; AnomalyResult that = (AnomalyResult) o; - return Objects.equal(detectorId, that.detectorId) - && Objects.equal(taskId, that.taskId) + return Objects.equal(confidence, that.confidence) && Objects.equal(anomalyScore, that.anomalyScore) && Objects.equal(anomalyGrade, that.anomalyGrade) - && Objects.equal(confidence, that.confidence) - && Objects.equal(featureData, that.featureData) - && Objects.equal(dataStartTime, that.dataStartTime) - && Objects.equal(dataEndTime, that.dataEndTime) - && Objects.equal(executionStartTime, that.executionStartTime) - && Objects.equal(executionEndTime, that.executionEndTime) - && Objects.equal(error, that.error) - && Objects.equal(entity, that.entity) - && Objects.equal(modelId, that.modelId) && Objects.equal(approxAnomalyStartTime, that.approxAnomalyStartTime) && Objects.equal(relevantAttribution, that.relevantAttribution) && Objects.equal(pastValues, that.pastValues) @@ -750,60 +688,45 @@ public boolean equals(Object o) { @Generated @Override public int hashCode() { - return Objects + final int prime = 31; + int result = super.hashCode(); + result = prime * result + Objects .hashCode( - detectorId, - taskId, + confidence, anomalyScore, anomalyGrade, - confidence, - featureData, - dataStartTime, - dataEndTime, - executionStartTime, - executionEndTime, - error, - entity, - modelId, approxAnomalyStartTime, relevantAttribution, pastValues, expectedValuesList, threshold ); + return result; } @Generated @Override public String toString() { - return new ToStringBuilder(this) - .append("detectorId", detectorId) - .append("taskId", taskId) - .append("anomalyScore", anomalyScore) - .append("anomalyGrade", anomalyGrade) - .append("confidence", confidence) - .append("featureData", featureData) - .append("dataStartTime", dataStartTime) - .append("dataEndTime", dataEndTime) - .append("executionStartTime", executionStartTime) - .append("executionEndTime", executionEndTime) - .append("error", error) - .append("entity", entity) - .append("modelId", modelId) - .append("approAnomalyStartTime", approxAnomalyStartTime) - .append("relavantAttribution", relevantAttribution) - .append("pastValues", pastValues) - .append("expectedValuesList", 
StringUtils.join(expectedValuesList, "|")) - .append("threshold", threshold) - .toString(); + return super.toString() + + ", " + + new ToStringBuilder(this) + .append("confidence", confidence) + .append("anomalyScore", anomalyScore) + .append("anomalyGrade", anomalyGrade) + .append("approAnomalyStartTime", approxAnomalyStartTime) + .append("relavantAttribution", relevantAttribution) + .append("pastValues", pastValues) + .append("expectedValuesList", StringUtils.join(expectedValuesList, "|")) + .append("threshold", threshold) + .toString(); } - public String getDetectorId() { - return detectorId; + public Double getConfidence() { + return confidence; } - public String getTaskId() { - return taskId; + public String getDetectorId() { + return configId; } public Double getAnomalyScore() { @@ -814,42 +737,6 @@ public Double getAnomalyGrade() { return anomalyGrade; } - public Double getConfidence() { - return confidence; - } - - public List<FeatureData> getFeatureData() { - return featureData; - } - - public Instant getDataStartTime() { - return dataStartTime; - } - - public Instant getDataEndTime() { - return dataEndTime; - } - - public Instant getExecutionStartTime() { - return executionStartTime; - } - - public Instant getExecutionEndTime() { - return executionEndTime; - } - - public String getError() { - return error; - } - - public Entity getEntity() { - return entity; - } - - public String getModelId() { - return modelId; - } - public Instant getApproAnomalyStartTime() { return approxAnomalyStartTime; } @@ -876,6 +763,7 @@ public Double getThreshold() { * @return whether the anomaly result is important when the anomaly grade is not 0 * or error is there. */ + @Override public boolean isHighPriority() { // AnomalyResult.toXContent won't record Double.NaN and thus make it null return (getAnomalyGrade() != null && getAnomalyGrade() > 0) || getError() != null; @@ -883,34 +771,10 @@ public boolean isHighPriority() { @Override public void writeTo(StreamOutput out) throws IOException { - out.writeString(detectorId); + super.writeTo(out); + out.writeDouble(confidence); out.writeDouble(anomalyScore); out.writeDouble(anomalyGrade); - out.writeDouble(confidence); - out.writeVInt(featureData.size()); - for (FeatureData feature : featureData) { - feature.writeTo(out); - } - out.writeInstant(dataStartTime); - out.writeInstant(dataEndTime); - out.writeInstant(executionStartTime); - out.writeInstant(executionEndTime); - out.writeOptionalString(error); - if (entity != null) { - out.writeBoolean(true); - entity.writeTo(out); - } else { - out.writeBoolean(false); - } - if (user != null) { - out.writeBoolean(true); // user exists - user.writeTo(out); - } else { - out.writeBoolean(false); // user does not exist - } - out.writeInt(schemaVersion); - out.writeOptionalString(taskId); - out.writeOptionalString(modelId); out.writeOptionalInstant(approxAnomalyStartTime); @@ -954,7 +818,7 @@ public static AnomalyResult getDummyResult() { null, null, null, - null, + Optional.empty(), null, CommonValue.NO_SCHEMA_VERSION, null diff --git a/src/main/java/org/opensearch/ad/model/AnomalyResultBucket.java b/src/main/java/org/opensearch/ad/model/AnomalyResultBucket.java index be72496b7..121d34f6d 100644 --- a/src/main/java/org/opensearch/ad/model/AnomalyResultBucket.java +++ b/src/main/java/org/opensearch/ad/model/AnomalyResultBucket.java @@ -15,7 +15,6 @@ import java.util.Map; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.annotation.Generated; import org.opensearch.core.common.io.stream.StreamInput;
import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; @@ -23,6 +22,7 @@ import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.search.aggregations.bucket.composite.CompositeAggregation.Bucket; import org.opensearch.search.aggregations.metrics.InternalMax; +import org.opensearch.timeseries.annotation.Generated; import com.google.common.base.Objects; diff --git a/src/main/java/org/opensearch/ad/model/DetectorInternalState.java b/src/main/java/org/opensearch/ad/model/DetectorInternalState.java index b9d86ce38..8630a7ac6 100644 --- a/src/main/java/org/opensearch/ad/model/DetectorInternalState.java +++ b/src/main/java/org/opensearch/ad/model/DetectorInternalState.java @@ -16,13 +16,13 @@ import java.io.IOException; import java.time.Instant; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.core.ParseField; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; diff --git a/src/main/java/org/opensearch/ad/model/DetectorProfile.java b/src/main/java/org/opensearch/ad/model/DetectorProfile.java index b7caa2ee3..77418552e 100644 --- a/src/main/java/org/opensearch/ad/model/DetectorProfile.java +++ b/src/main/java/org/opensearch/ad/model/DetectorProfile.java @@ -16,7 +16,7 @@ import org.apache.commons.lang.builder.EqualsBuilder; import org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; @@ -188,41 +188,41 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws XContentBuilder xContentBuilder = builder.startObject(); if (state != null) { - xContentBuilder.field(CommonName.STATE, state); + xContentBuilder.field(ADCommonName.STATE, state); } if (error != null) { - xContentBuilder.field(CommonName.ERROR, error); + xContentBuilder.field(ADCommonName.ERROR, error); } if (modelProfile != null && modelProfile.length > 0) { - xContentBuilder.startArray(CommonName.MODELS); + xContentBuilder.startArray(ADCommonName.MODELS); for (ModelProfileOnNode profile : modelProfile) { profile.toXContent(xContentBuilder, params); } xContentBuilder.endArray(); } if (shingleSize != -1) { - xContentBuilder.field(CommonName.SHINGLE_SIZE, shingleSize); + xContentBuilder.field(ADCommonName.SHINGLE_SIZE, shingleSize); } if (coordinatingNode != null && !coordinatingNode.isEmpty()) { - xContentBuilder.field(CommonName.COORDINATING_NODE, coordinatingNode); + xContentBuilder.field(ADCommonName.COORDINATING_NODE, coordinatingNode); } if (totalSizeInBytes != -1) { - xContentBuilder.field(CommonName.TOTAL_SIZE_IN_BYTES, totalSizeInBytes); + xContentBuilder.field(ADCommonName.TOTAL_SIZE_IN_BYTES, totalSizeInBytes); } if (initProgress != null) { - xContentBuilder.field(CommonName.INIT_PROGRESS, initProgress); + xContentBuilder.field(ADCommonName.INIT_PROGRESS, initProgress); } if (totalEntities != null) { - xContentBuilder.field(CommonName.TOTAL_ENTITIES, 
totalEntities); + xContentBuilder.field(ADCommonName.TOTAL_ENTITIES, totalEntities); } if (activeEntities != null) { - xContentBuilder.field(CommonName.ACTIVE_ENTITIES, activeEntities); + xContentBuilder.field(ADCommonName.ACTIVE_ENTITIES, activeEntities); } if (adTaskProfile != null) { - xContentBuilder.field(CommonName.AD_TASK, adTaskProfile); + xContentBuilder.field(ADCommonName.AD_TASK, adTaskProfile); } if (modelCount > 0) { - xContentBuilder.field(CommonName.MODEL_COUNT, modelCount); + xContentBuilder.field(ADCommonName.MODEL_COUNT, modelCount); } return xContentBuilder.endObject(); } @@ -428,37 +428,37 @@ public String toString() { ToStringBuilder toStringBuilder = new ToStringBuilder(this); if (state != null) { - toStringBuilder.append(CommonName.STATE, state); + toStringBuilder.append(ADCommonName.STATE, state); } if (error != null) { - toStringBuilder.append(CommonName.ERROR, error); + toStringBuilder.append(ADCommonName.ERROR, error); } if (modelProfile != null && modelProfile.length > 0) { toStringBuilder.append(modelProfile); } if (shingleSize != -1) { - toStringBuilder.append(CommonName.SHINGLE_SIZE, shingleSize); + toStringBuilder.append(ADCommonName.SHINGLE_SIZE, shingleSize); } if (coordinatingNode != null) { - toStringBuilder.append(CommonName.COORDINATING_NODE, coordinatingNode); + toStringBuilder.append(ADCommonName.COORDINATING_NODE, coordinatingNode); } if (totalSizeInBytes != -1) { - toStringBuilder.append(CommonName.TOTAL_SIZE_IN_BYTES, totalSizeInBytes); + toStringBuilder.append(ADCommonName.TOTAL_SIZE_IN_BYTES, totalSizeInBytes); } if (initProgress != null) { - toStringBuilder.append(CommonName.INIT_PROGRESS, initProgress); + toStringBuilder.append(ADCommonName.INIT_PROGRESS, initProgress); } if (totalEntities != null) { - toStringBuilder.append(CommonName.TOTAL_ENTITIES, totalEntities); + toStringBuilder.append(ADCommonName.TOTAL_ENTITIES, totalEntities); } if (activeEntities != null) { - toStringBuilder.append(CommonName.ACTIVE_ENTITIES, activeEntities); + toStringBuilder.append(ADCommonName.ACTIVE_ENTITIES, activeEntities); } if (adTaskProfile != null) { - toStringBuilder.append(CommonName.AD_TASK, adTaskProfile); + toStringBuilder.append(ADCommonName.AD_TASK, adTaskProfile); } if (modelCount > 0) { - toStringBuilder.append(CommonName.MODEL_COUNT, modelCount); + toStringBuilder.append(ADCommonName.MODEL_COUNT, modelCount); } return toStringBuilder.toString(); } diff --git a/src/main/java/org/opensearch/ad/model/DetectorProfileName.java b/src/main/java/org/opensearch/ad/model/DetectorProfileName.java index 2b8f220a3..443066ac8 100644 --- a/src/main/java/org/opensearch/ad/model/DetectorProfileName.java +++ b/src/main/java/org/opensearch/ad/model/DetectorProfileName.java @@ -14,21 +14,21 @@ import java.util.Collection; import java.util.Set; -import org.opensearch.ad.Name; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.timeseries.Name; public enum DetectorProfileName implements Name { - STATE(CommonName.STATE), - ERROR(CommonName.ERROR), - COORDINATING_NODE(CommonName.COORDINATING_NODE), - SHINGLE_SIZE(CommonName.SHINGLE_SIZE), - TOTAL_SIZE_IN_BYTES(CommonName.TOTAL_SIZE_IN_BYTES), - MODELS(CommonName.MODELS), - INIT_PROGRESS(CommonName.INIT_PROGRESS), - TOTAL_ENTITIES(CommonName.TOTAL_ENTITIES), - ACTIVE_ENTITIES(CommonName.ACTIVE_ENTITIES), - AD_TASK(CommonName.AD_TASK); + 
STATE(ADCommonName.STATE), + ERROR(ADCommonName.ERROR), + COORDINATING_NODE(ADCommonName.COORDINATING_NODE), + SHINGLE_SIZE(ADCommonName.SHINGLE_SIZE), + TOTAL_SIZE_IN_BYTES(ADCommonName.TOTAL_SIZE_IN_BYTES), + MODELS(ADCommonName.MODELS), + INIT_PROGRESS(ADCommonName.INIT_PROGRESS), + TOTAL_ENTITIES(ADCommonName.TOTAL_ENTITIES), + ACTIVE_ENTITIES(ADCommonName.ACTIVE_ENTITIES), + AD_TASK(ADCommonName.AD_TASK); private String name; @@ -48,28 +48,28 @@ public String getName() { public static DetectorProfileName getName(String name) { switch (name) { - case CommonName.STATE: + case ADCommonName.STATE: return STATE; - case CommonName.ERROR: + case ADCommonName.ERROR: return ERROR; - case CommonName.COORDINATING_NODE: + case ADCommonName.COORDINATING_NODE: return COORDINATING_NODE; - case CommonName.SHINGLE_SIZE: + case ADCommonName.SHINGLE_SIZE: return SHINGLE_SIZE; - case CommonName.TOTAL_SIZE_IN_BYTES: + case ADCommonName.TOTAL_SIZE_IN_BYTES: return TOTAL_SIZE_IN_BYTES; - case CommonName.MODELS: + case ADCommonName.MODELS: return MODELS; - case CommonName.INIT_PROGRESS: + case ADCommonName.INIT_PROGRESS: return INIT_PROGRESS; - case CommonName.TOTAL_ENTITIES: + case ADCommonName.TOTAL_ENTITIES: return TOTAL_ENTITIES; - case CommonName.ACTIVE_ENTITIES: + case ADCommonName.ACTIVE_ENTITIES: return ACTIVE_ENTITIES; - case CommonName.AD_TASK: + case ADCommonName.AD_TASK: return AD_TASK; default: - throw new IllegalArgumentException(CommonErrorMessages.UNSUPPORTED_PROFILE_TYPE); + throw new IllegalArgumentException(ADCommonMessages.UNSUPPORTED_PROFILE_TYPE); } } diff --git a/src/main/java/org/opensearch/ad/model/DetectorValidationIssue.java b/src/main/java/org/opensearch/ad/model/DetectorValidationIssue.java index 66650d922..48586e7f8 100644 --- a/src/main/java/org/opensearch/ad/model/DetectorValidationIssue.java +++ b/src/main/java/org/opensearch/ad/model/DetectorValidationIssue.java @@ -19,6 +19,9 @@ import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; import com.google.common.base.Objects; @@ -38,7 +41,7 @@ public class DetectorValidationIssue implements ToXContentObject, Writeable { private static final String SUB_ISSUES_FIELD_NAME = "sub_issues"; private final ValidationAspect aspect; - private final DetectorValidationIssueType type; + private final ValidationIssueType type; private final String message; private Map<String, String> subIssues; private IntervalTimeConfiguration intervalSuggestion; @@ -47,7 +50,7 @@ public ValidationAspect getAspect() { return aspect; } - public DetectorValidationIssueType getType() { + public ValidationIssueType getType() { return type; } @@ -65,7 +68,7 @@ public IntervalTimeConfiguration getIntervalSuggestion() { public DetectorValidationIssue( ValidationAspect aspect, - DetectorValidationIssueType type, + ValidationIssueType type, String message, Map<String, String> subIssues, IntervalTimeConfiguration intervalSuggestion @@ -77,13 +80,13 @@ public DetectorValidationIssue( this.intervalSuggestion = intervalSuggestion; } - public DetectorValidationIssue(ValidationAspect aspect, DetectorValidationIssueType type, String message) { + public DetectorValidationIssue(ValidationAspect aspect, ValidationIssueType type, String message) { this(aspect, type, message, null, null); } public
DetectorValidationIssue(StreamInput input) throws IOException { aspect = input.readEnum(ValidationAspect.class); - type = input.readEnum(DetectorValidationIssueType.class); + type = input.readEnum(ValidationIssueType.class); message = input.readString(); if (input.readBoolean()) { subIssues = input.readMap(StreamInput::readString, StreamInput::readString); diff --git a/src/main/java/org/opensearch/ad/model/DetectorValidationIssueType.java b/src/main/java/org/opensearch/ad/model/DetectorValidationIssueType.java deleted file mode 100644 index fd7975f3b..000000000 --- a/src/main/java/org/opensearch/ad/model/DetectorValidationIssueType.java +++ /dev/null @@ -1,47 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.model; - -import org.opensearch.ad.Name; - -public enum DetectorValidationIssueType implements Name { - NAME(AnomalyDetector.NAME_FIELD), - TIMEFIELD_FIELD(AnomalyDetector.TIMEFIELD_FIELD), - SHINGLE_SIZE_FIELD(AnomalyDetector.SHINGLE_SIZE_FIELD), - INDICES(AnomalyDetector.INDICES_FIELD), - FEATURE_ATTRIBUTES(AnomalyDetector.FEATURE_ATTRIBUTES_FIELD), - DETECTION_INTERVAL(AnomalyDetector.DETECTION_INTERVAL_FIELD), - CATEGORY(AnomalyDetector.CATEGORY_FIELD), - FILTER_QUERY(AnomalyDetector.FILTER_QUERY_FIELD), - WINDOW_DELAY(AnomalyDetector.WINDOW_DELAY_FIELD), - GENERAL_SETTINGS(AnomalyDetector.GENERAL_SETTINGS), - RESULT_INDEX(AnomalyDetector.RESULT_INDEX_FIELD), - TIMEOUT(AnomalyDetector.TIMEOUT), - AGGREGATION(AnomalyDetector.AGGREGATION); // this is a unique case where aggregation failed due to an issue in core but - // don't want to throw exception - - private String name; - - DetectorValidationIssueType(String name) { - this.name = name; - } - - /** - * Get validation type - * - * @return name - */ - @Override - public String getName() { - return name; - } -} diff --git a/src/main/java/org/opensearch/ad/model/EntityProfile.java b/src/main/java/org/opensearch/ad/model/EntityProfile.java index 7a6078aeb..4f2306e96 100644 --- a/src/main/java/org/opensearch/ad/model/EntityProfile.java +++ b/src/main/java/org/opensearch/ad/model/EntityProfile.java @@ -17,7 +17,7 @@ import org.apache.commons.lang.builder.EqualsBuilder; import org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; @@ -168,13 +168,13 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws builder.field(LAST_SAMPLE_TIMESTAMP, lastSampleTimestampMs); } if (initProgress != null) { - builder.field(CommonName.INIT_PROGRESS, initProgress); + builder.field(ADCommonName.INIT_PROGRESS, initProgress); } if (modelProfile != null) { - builder.field(CommonName.MODEL, modelProfile); + builder.field(ADCommonName.MODEL, modelProfile); } if (state != null && state != EntityState.UNKNOWN) { - builder.field(CommonName.STATE, state); + builder.field(ADCommonName.STATE, state); } builder.endObject(); return builder; @@ -213,13 +213,13 @@ public String toString() { builder.append(LAST_SAMPLE_TIMESTAMP, 
lastSampleTimestampMs); } if (initProgress != null) { - builder.append(CommonName.INIT_PROGRESS, initProgress); + builder.append(ADCommonName.INIT_PROGRESS, initProgress); } if (modelProfile != null) { - builder.append(CommonName.MODELS, modelProfile); + builder.append(ADCommonName.MODELS, modelProfile); } if (state != null && state != EntityState.UNKNOWN) { - builder.append(CommonName.STATE, state); + builder.append(ADCommonName.STATE, state); } return builder.toString(); } diff --git a/src/main/java/org/opensearch/ad/model/EntityProfileName.java b/src/main/java/org/opensearch/ad/model/EntityProfileName.java index 0d01df54e..84fd92987 100644 --- a/src/main/java/org/opensearch/ad/model/EntityProfileName.java +++ b/src/main/java/org/opensearch/ad/model/EntityProfileName.java @@ -14,15 +14,15 @@ import java.util.Collection; import java.util.Set; -import org.opensearch.ad.Name; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.timeseries.Name; public enum EntityProfileName implements Name { - INIT_PROGRESS(CommonName.INIT_PROGRESS), - ENTITY_INFO(CommonName.ENTITY_INFO), - STATE(CommonName.STATE), - MODELS(CommonName.MODELS); + INIT_PROGRESS(ADCommonName.INIT_PROGRESS), + ENTITY_INFO(ADCommonName.ENTITY_INFO), + STATE(ADCommonName.STATE), + MODELS(ADCommonName.MODELS); private String name; @@ -42,16 +42,16 @@ public String getName() { public static EntityProfileName getName(String name) { switch (name) { - case CommonName.INIT_PROGRESS: + case ADCommonName.INIT_PROGRESS: return INIT_PROGRESS; - case CommonName.ENTITY_INFO: + case ADCommonName.ENTITY_INFO: return ENTITY_INFO; - case CommonName.STATE: + case ADCommonName.STATE: return STATE; - case CommonName.MODELS: + case ADCommonName.MODELS: return MODELS; default: - throw new IllegalArgumentException(CommonErrorMessages.UNSUPPORTED_PROFILE_TYPE); + throw new IllegalArgumentException(ADCommonMessages.UNSUPPORTED_PROFILE_TYPE); } } diff --git a/src/main/java/org/opensearch/ad/model/ExpectedValueList.java b/src/main/java/org/opensearch/ad/model/ExpectedValueList.java index ee6a168af..14abc4cc6 100644 --- a/src/main/java/org/opensearch/ad/model/ExpectedValueList.java +++ b/src/main/java/org/opensearch/ad/model/ExpectedValueList.java @@ -25,13 +25,13 @@ import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.DataByFeatureId; import com.google.common.base.Objects; public class ExpectedValueList implements ToXContentObject, Writeable { public static final String LIKELIHOOD_FIELD = "likelihood"; - public static final String VALUE_LIST_FIELD = "value_list"; - private Double likelihood; private List<DataByFeatureId> valueList; @@ -52,7 +52,7 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws xContentBuilder.field(LIKELIHOOD_FIELD, likelihood); } if (valueList != null) { - xContentBuilder.field(VALUE_LIST_FIELD, valueList.toArray()); + xContentBuilder.field(CommonName.VALUE_LIST_FIELD, valueList.toArray()); } return xContentBuilder.endObject(); } @@ -75,7 +75,7 @@ public static ExpectedValueList parse(XContentParser parser) throws IOException case LIKELIHOOD_FIELD: likelihood = parser.doubleValue(); break; - case VALUE_LIST_FIELD: + case CommonName.VALUE_LIST_FIELD:
ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); while (parser.nextToken() != XContentParser.Token.END_ARRAY) { valueList.add(DataByFeatureId.parse(parser)); diff --git a/src/main/java/org/opensearch/ad/model/ModelProfile.java b/src/main/java/org/opensearch/ad/model/ModelProfile.java index 52c3d9ce3..1d6d0ce85 100644 --- a/src/main/java/org/opensearch/ad/model/ModelProfile.java +++ b/src/main/java/org/opensearch/ad/model/ModelProfile.java @@ -16,13 +16,13 @@ import org.apache.commons.lang.builder.EqualsBuilder; import org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.util.Bwc; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; /** * Used to show model information in profile API @@ -43,42 +43,26 @@ public ModelProfile(String modelId, Entity entity, long modelSizeInBytes) { public ModelProfile(StreamInput in) throws IOException { this.modelId = in.readString(); - if (Bwc.supportMultiCategoryFields(in.getVersion())) { - if (in.readBoolean()) { - this.entity = new Entity(in); - } else { - this.entity = null; - } + if (in.readBoolean()) { + this.entity = new Entity(in); } else { this.entity = null; } + this.modelSizeInBytes = in.readLong(); - if (!Bwc.supportMultiCategoryFields(in.getVersion())) { - // removed nodeId since Opensearch 1.1 - // read it and do no assignment - in.readString(); - } } @Override public void writeTo(StreamOutput out) throws IOException { out.writeString(modelId); - if (Bwc.supportMultiCategoryFields(out.getVersion())) { - if (entity != null) { - out.writeBoolean(true); - entity.writeTo(out); - } else { - out.writeBoolean(false); - } + if (entity != null) { + out.writeBoolean(true); + entity.writeTo(out); + } else { + out.writeBoolean(false); } out.writeLong(modelSizeInBytes); - // removed nodeId since Opensearch 1.1 - if (!Bwc.supportMultiCategoryFields(out.getVersion())) { - // write empty string for node id as we don't have it - // otherwise, we will get EOFException - out.writeString(CommonName.EMPTY_FIELD); - } } public String getModelId() { @@ -95,7 +79,7 @@ public long getModelSizeInBytes() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.field(CommonName.MODEL_ID_KEY, modelId); + builder.field(CommonName.MODEL_ID_FIELD, modelId); if (entity != null) { builder.field(CommonName.ENTITY_KEY, entity); } @@ -131,7 +115,7 @@ public int hashCode() { @Override public String toString() { ToStringBuilder builder = new ToStringBuilder(this); - builder.append(CommonName.MODEL_ID_KEY, modelId); + builder.append(CommonName.MODEL_ID_FIELD, modelId); if (modelSizeInBytes > 0) { builder.append(CommonName.MODEL_SIZE_IN_BYTES, modelSizeInBytes); } diff --git a/src/main/java/org/opensearch/ad/model/ModelProfileOnNode.java b/src/main/java/org/opensearch/ad/model/ModelProfileOnNode.java index 265018aa6..1e45bcc7a 100644 --- a/src/main/java/org/opensearch/ad/model/ModelProfileOnNode.java +++ b/src/main/java/org/opensearch/ad/model/ModelProfileOnNode.java @@ -16,7 +16,7 @@ import org.apache.commons.lang.builder.EqualsBuilder; import 
org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; @@ -98,7 +98,7 @@ public int hashCode() { @Override public String toString() { ToStringBuilder builder = new ToStringBuilder(this); - builder.append(CommonName.MODEL, modelProfile); + builder.append(ADCommonName.MODEL, modelProfile); builder.append(NODE_ID, nodeId); return builder.toString(); } diff --git a/src/main/java/org/opensearch/ad/ratelimit/BatchWorker.java b/src/main/java/org/opensearch/ad/ratelimit/BatchWorker.java index 3151d5eec..7ba8b4383 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/BatchWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/BatchWorker.java @@ -19,14 +19,14 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.support.ThreadedActionListener; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; /** * @@ -46,7 +46,7 @@ public BatchWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -111,7 +111,7 @@ protected void execute(Runnable afterProcessCallback, Runnable emptyQueueCallbac ThreadedActionListener listener = new ThreadedActionListener<>( LOG, threadPool, - AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, getResponseListener(toProcess, batchRequest), false ); diff --git a/src/main/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapter.java b/src/main/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapter.java index 072855069..91382a4b5 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapter.java +++ b/src/main/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapter.java @@ -62,7 +62,7 @@ public CheckPointMaintainRequestAdapter( } public Optional convert(CheckpointMaintainRequest request) { - String detectorId = request.getDetectorId(); + String detectorId = request.getId(); String modelId = request.getEntityModelId(); Optional> stateToMaintain = cache.get().getForMaintainance(detectorId, modelId); diff --git a/src/main/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorker.java b/src/main/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorker.java index ee9ec4ff7..05f9480a7 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorker.java @@ -11,8 +11,8 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; 
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS; import java.time.Clock; import java.time.Duration; @@ -23,13 +23,13 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; public class CheckpointMaintainWorker extends ScheduledWorker { private static final Logger LOG = LogManager.getLogger(CheckpointMaintainWorker.class); @@ -43,7 +43,7 @@ public CheckpointMaintainWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -76,15 +76,15 @@ public CheckpointMaintainWorker( nodeStateManager ); - this.batchSize = CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, it -> this.batchSize = it); + this.batchSize = AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, it -> this.batchSize = it); - this.expectedExecutionTimeInMilliSecsPerRequest = AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS + this.expectedExecutionTimeInMilliSecsPerRequest = AnomalyDetectorSettings.AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS .get(settings); clusterService .getClusterSettings() .addSettingsUpdateConsumer( - EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, + AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, it -> this.expectedExecutionTimeInMilliSecsPerRequest = it ); this.adapter = adapter; diff --git a/src/main/java/org/opensearch/ad/ratelimit/CheckpointReadWorker.java b/src/main/java/org/opensearch/ad/ratelimit/CheckpointReadWorker.java index 911476095..d4f1f99af 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/CheckpointReadWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/CheckpointReadWorker.java @@ -11,8 +11,8 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_CONCURRENCY; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_CONCURRENCY; import java.time.Clock; import java.time.Duration; @@ -32,14 +32,10 @@ import org.opensearch.action.get.MultiGetItemResponse; import org.opensearch.action.get.MultiGetRequest; import org.opensearch.action.get.MultiGetResponse; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import 
org.opensearch.ad.caching.CacheProvider; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.indices.ADIndex; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager; @@ -47,17 +43,23 @@ import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; -import org.opensearch.ad.util.ExceptionUtil; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.index.IndexNotFoundException; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.ParseUtils; /** * a queue for loading model checkpoint. The read is a multi-get query. 
Possible results are: @@ -78,7 +80,7 @@ public class CheckpointReadWorker extends BatchWorker maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -103,7 +105,7 @@ public CheckpointReadWorker( EntityColdStartWorker entityColdStartQueue, ResultWriteWorker resultWriteQueue, NodeStateManager stateManager, - AnomalyDetectionIndices indexUtil, + ADIndexManagement indexUtil, CacheProvider cacheProvider, Duration stateTtl, CheckpointWriteWorker checkpointWriteQueue, @@ -124,9 +126,9 @@ public CheckpointReadWorker( mediumSegmentPruneRatio, lowSegmentPruneRatio, maintenanceFreqConstant, - CHECKPOINT_READ_QUEUE_CONCURRENCY, + AD_CHECKPOINT_READ_QUEUE_CONCURRENCY, executionTtl, - CHECKPOINT_READ_QUEUE_BATCH_SIZE, + AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE, stateTtl, stateManager ); @@ -161,7 +163,7 @@ protected MultiGetRequest toBatchRequest(List toProcess) { if (false == modelId.isPresent()) { continue; } - multiGetRequest.add(new MultiGetRequest.Item(CommonName.CHECKPOINT_INDEX_NAME, modelId.get())); + multiGetRequest.add(new MultiGetRequest.Item(ADCommonName.CHECKPOINT_INDEX_NAME, modelId.get())); } return multiGetRequest; } @@ -246,7 +248,7 @@ protected ActionListener getResponseListener(List> onGetDetector( + private ActionListener> onGetDetector( EntityFeatureRequest origRequest, int index, String detectorId, @@ -354,7 +357,7 @@ private ActionListener> onGetDetector( return; } - AnomalyDetector detector = detectorOptional.get(); + AnomalyDetector detector = (AnomalyDetector) detectorOptional.get(); ModelState modelState = modelManager .processEntityCheckpoint(checkpoint, entity, modelId, detectorId, detector.getShingleSize()); @@ -386,31 +389,35 @@ private ActionListener> onGetDetector( } if (result != null && result.getRcfScore() > 0) { - AnomalyResult resultToSave = result - .toAnomalyResult( + RequestPriority requestPriority = result.getGrade() > 0 ? RequestPriority.HIGH : RequestPriority.MEDIUM; + + List resultsToSave = result + .toIndexableResults( detector, Instant.ofEpochMilli(origRequest.getDataStartTimeMillis()), - Instant.ofEpochMilli(origRequest.getDataStartTimeMillis() + detector.getDetectorIntervalInMilliseconds()), + Instant.ofEpochMilli(origRequest.getDataStartTimeMillis() + detector.getIntervalInMilliseconds()), Instant.now(), Instant.now(), ParseUtils.getFeatureData(origRequest.getCurrentFeature(), detector), - entity, + Optional.ofNullable(entity), indexUtil.getSchemaVersion(ADIndex.RESULT), modelId, null, null ); - resultWriteQueue - .put( - new ResultWriteRequest( - origRequest.getExpirationEpochMs(), - detectorId, - result.getGrade() > 0 ? 
RequestPriority.HIGH : RequestPriority.MEDIUM, - resultToSave, - detector.getResultIndex() - ) - ); + for (AnomalyResult r : resultsToSave) { + resultWriteQueue + .put( + new ResultWriteRequest( + origRequest.getExpirationEpochMs(), + detectorId, + requestPriority, + r, + detector.getCustomResultIndex() + ) + ); + } } // try to load to cache diff --git a/src/main/java/org/opensearch/ad/ratelimit/CheckpointWriteWorker.java b/src/main/java/org/opensearch/ad/ratelimit/CheckpointWriteWorker.java index 51751dccc..a26cb8b94 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/CheckpointWriteWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/CheckpointWriteWorker.java @@ -11,8 +11,8 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_CONCURRENCY; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY; import java.time.Clock; import java.time.Duration; @@ -30,19 +30,21 @@ import org.opensearch.action.bulk.BulkRequest; import org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.update.UpdateRequest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.Strings; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.util.ExceptionUtil; public class CheckpointWriteWorker extends BatchWorker { private static final Logger LOG = LogManager.getLogger(CheckpointWriteWorker.class); @@ -58,7 +60,7 @@ public CheckpointWriteWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -88,9 +90,9 @@ public CheckpointWriteWorker( mediumSegmentPruneRatio, lowSegmentPruneRatio, maintenanceFreqConstant, - CHECKPOINT_WRITE_QUEUE_CONCURRENCY, + AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY, executionTtl, - CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, + AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, stateTtl, stateManager ); @@ -131,7 +133,7 @@ protected ActionListener getResponseListener(List modelState, boolean forceWrite, Reques } if (modelState.getModel() != null) { - String detectorId = modelState.getDetectorId(); + String detectorId = modelState.getId(); String modelId = modelState.getModelId(); if (modelId == null || detectorId == null) { return; } - nodeStateManager.getAnomalyDetector(detectorId, onGetDetector(detectorId, modelId, modelState, priority)); + nodeStateManager.getConfig(detectorId, AnalysisType.AD, onGetDetector(detectorId, modelId, modelState, 
priority)); } } - private ActionListener> onGetDetector( + private ActionListener> onGetDetector( String detectorId, String modelId, ModelState modelState, @@ -179,7 +181,7 @@ private ActionListener> onGetDetector( return; } - AnomalyDetector detector = detectorOptional.get(); + AnomalyDetector detector = (AnomalyDetector) detectorOptional.get(); try { Map source = checkpoint.toIndexSource(modelState); @@ -190,7 +192,7 @@ private ActionListener> onGetDetector( modelState.setLastCheckpointTime(clock.instant()); CheckpointWriteRequest request = new CheckpointWriteRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), detectorId, priority, // If the document does not already exist, the contents of the upsert element @@ -216,13 +218,13 @@ private ActionListener> onGetDetector( } public void writeAll(List> modelStates, String detectorId, boolean forceWrite, RequestPriority priority) { - ActionListener> onGetForAll = ActionListener.wrap(detectorOptional -> { + ActionListener> onGetForAll = ActionListener.wrap(detectorOptional -> { if (false == detectorOptional.isPresent()) { LOG.warn(new ParameterizedMessage("AnomalyDetector [{}] is not available.", detectorId)); return; } - AnomalyDetector detector = detectorOptional.get(); + AnomalyDetector detector = (AnomalyDetector) detectorOptional.get(); try { List allRequests = new ArrayList<>(); for (ModelState state : modelStates) { @@ -243,7 +245,7 @@ public void writeAll(List> modelStates, String detectorI allRequests .add( new CheckpointWriteRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), detectorId, priority, // If the document does not already exist, the contents of the upsert element @@ -269,6 +271,6 @@ public void writeAll(List> modelStates, String detectorI }, exception -> { LOG.error(new ParameterizedMessage("fail to get detector [{}]", detectorId), exception); }); - nodeStateManager.getAnomalyDetector(detectorId, onGetForAll); + nodeStateManager.getConfig(detectorId, AnalysisType.AD, onGetForAll); } } diff --git a/src/main/java/org/opensearch/ad/ratelimit/ColdEntityWorker.java b/src/main/java/org/opensearch/ad/ratelimit/ColdEntityWorker.java index 3a5f52644..701fc25d4 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/ColdEntityWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/ColdEntityWorker.java @@ -11,8 +11,8 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS; import java.time.Clock; import java.time.Duration; @@ -20,13 +20,13 @@ import java.util.Random; import java.util.stream.Collectors; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import 
org.opensearch.timeseries.breaker.CircuitBreakerService; /** * A queue slowly releasing low-priority requests to CheckpointReadQueue @@ -52,7 +52,7 @@ public ColdEntityWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -84,15 +84,15 @@ public ColdEntityWorker( nodeStateManager ); - this.batchSize = CHECKPOINT_READ_QUEUE_BATCH_SIZE.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(CHECKPOINT_READ_QUEUE_BATCH_SIZE, it -> this.batchSize = it); + this.batchSize = AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE, it -> this.batchSize = it); - this.expectedExecutionTimeInMilliSecsPerRequest = AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS + this.expectedExecutionTimeInMilliSecsPerRequest = AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS .get(settings); clusterService .getClusterSettings() .addSettingsUpdateConsumer( - EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, + AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, it -> this.expectedExecutionTimeInMilliSecsPerRequest = it ); } diff --git a/src/main/java/org/opensearch/ad/ratelimit/ConcurrentWorker.java b/src/main/java/org/opensearch/ad/ratelimit/ConcurrentWorker.java index 9861e5056..3df70c935 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/ConcurrentWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/ConcurrentWorker.java @@ -19,13 +19,13 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; /** * A queue to run concurrent requests (either batch or single request). 
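(Illustrative aside on the pattern the ConcurrentWorker hunks below rely on: processing is handed to the AD thread pool and only proceeds when a semaphore permit can be acquired, which caps how many requests run at once. The sketch below is a minimal, hypothetical rendering of that permit-gated execution in plain java.util.concurrent terms; it is not part of this patch, and it simplifies one detail: the real worker releases its permit from an async callback, while this sketch releases it inline when the task finishes.)

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;

// Hypothetical sketch of permit-gated execution: at most `concurrency`
// requests run on the pool at once; the permit is returned in a finally
// block so a failing task cannot leak it.
public class PermitGatedWorker {
    private final Semaphore permits;
    private final ExecutorService threadPool;

    public PermitGatedWorker(int concurrency, ExecutorService threadPool) {
        this.permits = new Semaphore(concurrency);
        this.threadPool = threadPool;
    }

    // Mirrors the shape of triggerProcess(): if no permit is free, do nothing;
    // a later trigger will pick the queued request up.
    public void triggerProcess(Runnable request) {
        threadPool.execute(() -> {
            if (permits.tryAcquire()) {
                try {
                    request.run();
                } finally {
                    permits.release();
                }
            }
        });
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        PermitGatedWorker worker = new PermitGatedWorker(2, pool);
        for (int i = 0; i < 8; i++) {
            final int id = i;
            worker.triggerProcess(() -> System.out.println("processed request " + id));
        }
        pool.shutdown();
    }
}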
@@ -74,7 +74,7 @@ public ConcurrentWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -132,7 +132,7 @@ public void maintenance() { */ @Override protected void triggerProcess() { - threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME).execute(() -> { + threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME).execute(() -> { if (permits.tryAcquire()) { try { lastExecuteTime = clock.instant(); diff --git a/src/main/java/org/opensearch/ad/ratelimit/EntityColdStartWorker.java b/src/main/java/org/opensearch/ad/ratelimit/EntityColdStartWorker.java index 214e48abb..72011e156 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/EntityColdStartWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/EntityColdStartWorker.java @@ -11,7 +11,7 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_CONCURRENCY; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_CONCURRENCY; import java.time.Clock; import java.time.Duration; @@ -23,20 +23,21 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.message.ParameterizedMessage; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.ml.EntityColdStarter; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager.ModelType; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.util.ExceptionUtil; /** * A queue for HCAD model training (a.k.a. cold start). 
As model training is a @@ -60,7 +61,7 @@ public EntityColdStartWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -89,7 +90,7 @@ public EntityColdStartWorker( mediumSegmentPruneRatio, lowSegmentPruneRatio, maintenanceFreqConstant, - ENTITY_COLD_START_QUEUE_CONCURRENCY, + AD_ENTITY_COLD_START_QUEUE_CONCURRENCY, executionTtl, stateTtl, nodeStateManager @@ -100,7 +101,7 @@ public EntityColdStartWorker( @Override protected void executeRequest(EntityRequest coldStartRequest, ActionListener listener) { - String detectorId = coldStartRequest.getDetectorId(); + String detectorId = coldStartRequest.getId(); Optional modelId = coldStartRequest.getModelId(); @@ -121,7 +122,7 @@ protected void executeRequest(EntityRequest coldStartRequest, ActionListener coldStartListener = ActionListener.wrap(r -> { - nodeStateManager.getAnomalyDetector(detectorId, ActionListener.wrap(detectorOptional -> { + nodeStateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(detectorOptional -> { try { if (!detectorOptional.isPresent()) { LOG @@ -133,7 +134,7 @@ protected void executeRequest(EntityRequest coldStartRequest, ActionListener requestQueues; private String lastSelectedRequestQueueId; protected Random random; - private ADCircuitBreakerService adCircuitBreakerService; + private CircuitBreakerService adCircuitBreakerService; protected ThreadPool threadPool; protected Instant cooldownStart; protected int coolDownMinutes; @@ -194,7 +194,7 @@ public RateLimitedRequestWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -228,7 +228,7 @@ public RateLimitedRequestWorker( this.lastSelectedRequestQueueId = null; this.requestQueues = new ConcurrentSkipListMap<>(); this.cooldownStart = Instant.MIN; - this.coolDownMinutes = (int) (COOLDOWN_MINUTES.get(settings).getMinutes()); + this.coolDownMinutes = (int) (AD_COOLDOWN_MINUTES.get(settings).getMinutes()); this.maintenanceFreqConstant = maintenanceFreqConstant; this.stateTtl = stateTtl; this.nodeStateManager = nodeStateManager; @@ -305,7 +305,7 @@ protected void putOnly(RequestType request) { // just use the RequestQueue priority (i.e., low or high) as the key of the RequestQueue map. RequestQueue requestQueue = requestQueues .computeIfAbsent( - RequestPriority.MEDIUM == request.getPriority() ? request.getDetectorId() : request.getPriority().name(), + RequestPriority.MEDIUM == request.getPriority() ? 
request.getId() : request.getPriority().name(), k -> new RequestQueue() ); @@ -551,15 +551,15 @@ protected void process() { } catch (Exception e) { LOG.error(new ParameterizedMessage("Fail to process requests in [{}].", this.workerName), e); } - }, new TimeValue(coolDownMinutes, TimeUnit.MINUTES), AnomalyDetectorPlugin.AD_THREAD_POOL_NAME); + }, new TimeValue(coolDownMinutes, TimeUnit.MINUTES), TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME); } else { try { triggerProcess(); } catch (Exception e) { LOG.error(String.format(Locale.ROOT, "Failed to process requests from %s", getWorkerName()), e); - if (e != null && e instanceof AnomalyDetectionException) { - AnomalyDetectionException adExep = (AnomalyDetectionException) e; - nodeStateManager.setException(adExep.getAnomalyDetectorId(), adExep); + if (e != null && e instanceof TimeSeriesException) { + TimeSeriesException adExep = (TimeSeriesException) e; + nodeStateManager.setException(adExep.getConfigId(), adExep); } } diff --git a/src/main/java/org/opensearch/ad/ratelimit/ResultWriteRequest.java b/src/main/java/org/opensearch/ad/ratelimit/ResultWriteRequest.java index 85c569ff2..a25bf3924 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/ResultWriteRequest.java +++ b/src/main/java/org/opensearch/ad/ratelimit/ResultWriteRequest.java @@ -50,7 +50,7 @@ public AnomalyResult getResult() { return result; } - public String getResultIndex() { + public String getCustomResultIndex() { return resultIndex; } } diff --git a/src/main/java/org/opensearch/ad/ratelimit/ResultWriteWorker.java b/src/main/java/org/opensearch/ad/ratelimit/ResultWriteWorker.java index 6ac1a5107..02152b086 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/ResultWriteWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/ResultWriteWorker.java @@ -11,8 +11,8 @@ package org.opensearch.ad.ratelimit; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.RESULT_WRITE_QUEUE_BATCH_SIZE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.RESULT_WRITE_QUEUE_CONCURRENCY; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_CONCURRENCY; import java.time.Clock; import java.time.Duration; @@ -25,14 +25,11 @@ import org.apache.logging.log4j.message.ParameterizedMessage; import org.opensearch.action.DocWriteRequest; import org.opensearch.action.index.IndexRequest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.transport.ADResultBulkRequest; import org.opensearch.ad.transport.ADResultBulkResponse; import org.opensearch.ad.transport.handler.MultiEntityResultHandler; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; @@ -44,6 +41,11 @@ import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.util.ExceptionUtil; public class ResultWriteWorker extends BatchWorker { private 
static final Logger LOG = LogManager.getLogger(ResultWriteWorker.class); @@ -58,7 +60,7 @@ public ResultWriteWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -87,9 +89,9 @@ public ResultWriteWorker( mediumSegmentPruneRatio, lowSegmentPruneRatio, maintenanceFreqConstant, - RESULT_WRITE_QUEUE_CONCURRENCY, + AD_RESULT_WRITE_QUEUE_CONCURRENCY, executionTtl, - RESULT_WRITE_QUEUE_BATCH_SIZE, + AD_RESULT_WRITE_QUEUE_BATCH_SIZE, stateTtl, stateManager ); @@ -137,7 +139,7 @@ protected ActionListener getResponseListener( } for (ResultWriteRequest request : toProcess) { - nodeStateManager.setException(request.getDetectorId(), exception); + nodeStateManager.setException(request.getId(), exception); } LOG.error("Fail to save results", exception); }); @@ -154,11 +156,11 @@ private void enqueueRetryRequestIteration(List requestToRetry, int return; } AnomalyResult result = resultToRetry.get(); - String detectorId = result.getDetectorId(); - nodeStateManager.getAnomalyDetector(detectorId, onGetDetector(requestToRetry, index, detectorId, result)); + String detectorId = result.getConfigId(); + nodeStateManager.getConfig(detectorId, AnalysisType.AD, onGetDetector(requestToRetry, index, detectorId, result)); } - private ActionListener> onGetDetector( + private ActionListener> onGetDetector( List requestToRetry, int index, String detectorId, @@ -171,15 +173,15 @@ private ActionListener> onGetDetector( return; } - AnomalyDetector detector = detectorOptional.get(); + AnomalyDetector detector = (AnomalyDetector) detectorOptional.get(); super.put( new ResultWriteRequest( // expire based on execute start time - resultToRetry.getExecutionStartTime().toEpochMilli() + detector.getDetectorIntervalInMilliseconds(), + resultToRetry.getExecutionStartTime().toEpochMilli() + detector.getIntervalInMilliseconds(), detectorId, resultToRetry.isHighPriority() ? 
RequestPriority.HIGH : RequestPriority.MEDIUM, resultToRetry, - detector.getResultIndex() + detector.getCustomResultIndex() ) ); diff --git a/src/main/java/org/opensearch/ad/ratelimit/ScheduledWorker.java b/src/main/java/org/opensearch/ad/ratelimit/ScheduledWorker.java index 1bfeec9af..115d79882 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/ScheduledWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/ScheduledWorker.java @@ -18,14 +18,14 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.common.unit.TimeValue; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; public abstract class ScheduledWorker extends RateLimitedRequestWorker { @@ -45,7 +45,7 @@ public ScheduledWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, @@ -114,7 +114,7 @@ private void pullRequests() { private synchronized void schedulePulling(TimeValue delay) { try { - threadPool.schedule(this::pullRequests, delay, AnomalyDetectorPlugin.AD_THREAD_POOL_NAME); + threadPool.schedule(this::pullRequests, delay, TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME); } catch (Exception e) { LOG.error("Fail to schedule cold entity pulling", e); } diff --git a/src/main/java/org/opensearch/ad/ratelimit/SingleRequestWorker.java b/src/main/java/org/opensearch/ad/ratelimit/SingleRequestWorker.java index 478799845..e789e36fa 100644 --- a/src/main/java/org/opensearch/ad/ratelimit/SingleRequestWorker.java +++ b/src/main/java/org/opensearch/ad/ratelimit/SingleRequestWorker.java @@ -19,13 +19,13 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.breaker.CircuitBreakerService; public abstract class SingleRequestWorker extends ConcurrentWorker { private static final Logger LOG = LogManager.getLogger(SingleRequestWorker.class); @@ -37,7 +37,7 @@ public SingleRequestWorker( Setting maxHeapPercentForQueueSetting, ClusterService clusterService, Random random, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, ThreadPool threadPool, Settings settings, float maxQueuedTaskRatio, diff --git a/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java index 331c3151f..ee0d410f5 100644 --- a/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java +++ 
diff --git a/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java
index 331c3151f..ee0d410f5 100644
--- a/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/AbstractAnomalyDetectorAction.java
@@ -11,12 +11,12 @@
 package org.opensearch.ad.rest;
 
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.DETECTION_INTERVAL;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.DETECTION_WINDOW_DELAY;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_ANOMALY_FEATURES;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT;
 
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.Settings;
@@ -33,23 +33,21 @@ public abstract class AbstractAnomalyDetectorAction extends BaseRestHandler {
     protected volatile Integer maxAnomalyFeatures;
 
     public AbstractAnomalyDetectorAction(Settings settings, ClusterService clusterService) {
-        this.requestTimeout = REQUEST_TIMEOUT.get(settings);
+        this.requestTimeout = AD_REQUEST_TIMEOUT.get(settings);
         this.detectionInterval = DETECTION_INTERVAL.get(settings);
         this.detectionWindowDelay = DETECTION_WINDOW_DELAY.get(settings);
-        this.maxSingleEntityDetectors = MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings);
-        this.maxMultiEntityDetectors = MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(settings);
+        this.maxSingleEntityDetectors = AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings);
+        this.maxMultiEntityDetectors = AD_MAX_HC_ANOMALY_DETECTORS.get(settings);
         this.maxAnomalyFeatures = MAX_ANOMALY_FEATURES.get(settings);
         // TODO: will add more cluster setting consumer later
         // TODO: inject ClusterSettings only if clusterService is only used to get ClusterSettings
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(REQUEST_TIMEOUT, it -> requestTimeout = it);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_REQUEST_TIMEOUT, it -> requestTimeout = it);
         clusterService.getClusterSettings().addSettingsUpdateConsumer(DETECTION_INTERVAL, it -> detectionInterval = it);
         clusterService.getClusterSettings().addSettingsUpdateConsumer(DETECTION_WINDOW_DELAY, it -> detectionWindowDelay = it);
         clusterService
             .getClusterSettings()
-            .addSettingsUpdateConsumer(MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, it -> maxSingleEntityDetectors = it);
-        clusterService
-            .getClusterSettings()
-            .addSettingsUpdateConsumer(MAX_MULTI_ENTITY_ANOMALY_DETECTORS, it -> maxMultiEntityDetectors = it);
+            .addSettingsUpdateConsumer(AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, it -> maxSingleEntityDetectors = it);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MAX_HC_ANOMALY_DETECTORS, it -> maxMultiEntityDetectors = it);
         clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_ANOMALY_FEATURES, it -> maxAnomalyFeatures = it);
     }
 }
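The handler keeps each cluster setting in a volatile field and registers one update consumer per setting, so values written by the settings-update thread become visible to request threads without a restart. A simplified stand-in for that registration pattern follows; the real ClusterSettings API differs, and DynamicSetting/TimeoutHolder are invented names for illustration.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    // Minimal stand-in for ClusterSettings#addSettingsUpdateConsumer.
    final class DynamicSetting<T> {
        private final List<Consumer<T>> consumers = new ArrayList<>();
        private volatile T value;

        DynamicSetting(T initial) {
            this.value = initial;
        }

        void addUpdateConsumer(Consumer<T> consumer) {
            consumers.add(consumer);
        }

        // Called when the cluster-level value changes; fans out to registered handlers.
        void update(T newValue) {
            this.value = newValue;
            consumers.forEach(c -> c.accept(newValue));
        }
    }

    final class TimeoutHolder {
        // volatile, like the handler's fields: read on request threads,
        // written by the settings-update thread.
        private volatile long requestTimeoutMillis;

        TimeoutHolder(DynamicSetting<Long> timeoutSetting, long initial) {
            this.requestTimeoutMillis = initial;
            timeoutSetting.addUpdateConsumer(it -> requestTimeoutMillis = it);
        }

        long requestTimeoutMillis() {
            return requestTimeoutMillis;
        }
    }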
diff --git a/src/main/java/org/opensearch/ad/rest/AbstractSearchAction.java b/src/main/java/org/opensearch/ad/rest/AbstractSearchAction.java
index 6a15e6758..1d0611cf7 100644
--- a/src/main/java/org/opensearch/ad/rest/AbstractSearchAction.java
+++ b/src/main/java/org/opensearch/ad/rest/AbstractSearchAction.java
@@ -11,8 +11,8 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.getSourceContext;
 import static org.opensearch.core.xcontent.ToXContent.EMPTY_PARAMS;
+import static org.opensearch.timeseries.util.RestHandlerUtils.getSourceContext;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -24,8 +24,8 @@
 import org.opensearch.action.ActionType;
 import org.opensearch.action.search.SearchRequest;
 import org.opensearch.action.search.SearchResponse;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.core.rest.RestStatus;
 import org.opensearch.core.xcontent.ToXContentObject;
@@ -66,8 +66,8 @@ public AbstractSearchAction(
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
         SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
         searchSourceBuilder.parseXContent(request.contentOrSourceParamParser());
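Every REST handler in this patch opens prepareRequest with the same guard: if the AD feature flag is off, throw IllegalStateException before any request parsing happens. A compact standalone sketch of that gate, with an assumed message string:

    // Illustrative stand-in for the ADEnabledSetting gate used by the handlers above.
    final class FeatureGate {
        static volatile boolean adEnabled = true;
        static final String DISABLED_ERR_MSG = "AD functionality is disabled"; // assumed text

        static void ensureEnabled() {
            if (!adEnabled) {
                // Same contract as the handlers: fail fast before touching the request body.
                throw new IllegalStateException(DISABLED_ERR_MSG);
            }
        }

        public static void main(String[] args) {
            ensureEnabled(); // passes
            adEnabled = false;
            try {
                ensureEnabled();
            } catch (IllegalStateException e) {
                System.out.println("rejected: " + e.getMessage());
            }
        }
    }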
diff --git a/src/main/java/org/opensearch/ad/rest/RestAnomalyDetectorJobAction.java b/src/main/java/org/opensearch/ad/rest/RestAnomalyDetectorJobAction.java
index ffec8eb93..175ac02e7 100644
--- a/src/main/java/org/opensearch/ad/rest/RestAnomalyDetectorJobAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestAnomalyDetectorJobAction.java
@@ -11,22 +11,20 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT;
-import static org.opensearch.ad.util.RestHandlerUtils.DETECTOR_ID;
-import static org.opensearch.ad.util.RestHandlerUtils.IF_PRIMARY_TERM;
-import static org.opensearch.ad.util.RestHandlerUtils.IF_SEQ_NO;
-import static org.opensearch.ad.util.RestHandlerUtils.START_JOB;
-import static org.opensearch.ad.util.RestHandlerUtils.STOP_JOB;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.RestHandlerUtils.DETECTOR_ID;
+import static org.opensearch.timeseries.util.RestHandlerUtils.IF_PRIMARY_TERM;
+import static org.opensearch.timeseries.util.RestHandlerUtils.IF_SEQ_NO;
+import static org.opensearch.timeseries.util.RestHandlerUtils.START_JOB;
+import static org.opensearch.timeseries.util.RestHandlerUtils.STOP_JOB;
 
 import java.io.IOException;
 import java.util.List;
 import java.util.Locale;
 
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.model.DetectionDateRange;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.AnomalyDetectorJobAction;
 import org.opensearch.ad.transport.AnomalyDetectorJobRequest;
 import org.opensearch.client.node.NodeClient;
@@ -38,6 +36,8 @@
 import org.opensearch.rest.BaseRestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.model.DateRange;
 
 import com.google.common.collect.ImmutableList;
@@ -50,8 +50,8 @@ public class RestAnomalyDetectorJobAction extends BaseRestHandler {
     private volatile TimeValue requestTimeout;
 
     public RestAnomalyDetectorJobAction(Settings settings, ClusterService clusterService) {
-        this.requestTimeout = REQUEST_TIMEOUT.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(REQUEST_TIMEOUT, it -> requestTimeout = it);
+        this.requestTimeout = AD_REQUEST_TIMEOUT.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_REQUEST_TIMEOUT, it -> requestTimeout = it);
     }
 
     @Override
@@ -61,8 +61,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         String detectorId = request.param(DETECTOR_ID);
@@ -70,7 +70,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
         long primaryTerm = request.paramAsLong(IF_PRIMARY_TERM, SequenceNumbers.UNASSIGNED_PRIMARY_TERM);
         boolean historical = request.paramAsBoolean("historical", false);
         String rawPath = request.rawPath();
-        DetectionDateRange detectionDateRange = parseDetectionDateRange(request);
+        DateRange detectionDateRange = parseDetectionDateRange(request);
 
         AnomalyDetectorJobRequest anomalyDetectorJobRequest = new AnomalyDetectorJobRequest(
             detectorId,
@@ -85,13 +85,13 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
             .execute(AnomalyDetectorJobAction.INSTANCE, anomalyDetectorJobRequest, new RestToXContentListener<>(channel));
     }
 
-    private DetectionDateRange parseDetectionDateRange(RestRequest request) throws IOException {
+    private DateRange parseDetectionDateRange(RestRequest request) throws IOException {
         if (!request.hasContent()) {
             return null;
         }
         XContentParser parser = request.contentParser();
         ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
-        DetectionDateRange dateRange = DetectionDateRange.parse(parser);
+        DateRange dateRange = DateRange.parse(parser);
         return dateRange;
     }
 
@@ -107,16 +107,17 @@ public List replacedRoutes() {
             // start AD Job
             new ReplacedRoute(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, START_JOB),
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, START_JOB),
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, START_JOB)
+                String
+                    .format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, START_JOB)
             ),
             // stop AD Job
             new ReplacedRoute(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, STOP_JOB),
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, STOP_JOB),
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, STOP_JOB)
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, STOP_JOB)
             )
         );
    }
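ReplacedRoute registers one handler under both the current path and the legacy Open Distro path, which is how the plugin stays backward compatible while renaming its base URIs. A sketch of that pairing; the path constants below are indicative assumptions, not quoted from the patch:

    import java.util.List;
    import java.util.Locale;

    // Illustrative stand-in for ReplacedRoute: the same handler answers on a new
    // path and the legacy one.
    record RoutePair(String method, String newPath, String legacyMethod, String legacyPath) {}

    final class JobRoutes {
        static final String AD_BASE_DETECTORS_URI = "/_plugins/_anomaly_detection/detectors";          // assumed
        static final String LEGACY_OPENDISTRO_AD_BASE_URI = "/_opendistro/_anomaly_detection/detectors"; // assumed

        static List<RoutePair> replacedRoutes(String detectorIdParam, String job) {
            return List.of(
                new RoutePair(
                    "POST",
                    String.format(Locale.ROOT, "%s/{%s}/%s", AD_BASE_DETECTORS_URI, detectorIdParam, job),
                    "POST",
                    String.format(Locale.ROOT, "%s/{%s}/%s", LEGACY_OPENDISTRO_AD_BASE_URI, detectorIdParam, job)
                )
            );
        }

        public static void main(String[] args) {
            System.out.println(replacedRoutes("detectorID", "_start"));
        }
    }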
diff --git a/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyDetectorAction.java
index 4baa2d4b5..b7a3aae6c 100644
--- a/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyDetectorAction.java
@@ -11,7 +11,7 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.DETECTOR_ID;
+import static org.opensearch.timeseries.util.RestHandlerUtils.DETECTOR_ID;
 
 import java.io.IOException;
 import java.util.List;
@@ -19,16 +19,16 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.rest.handler.AnomalyDetectorActionHandler;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.DeleteAnomalyDetectorAction;
 import org.opensearch.ad.transport.DeleteAnomalyDetectorRequest;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.rest.BaseRestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -51,8 +51,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         String detectorId = request.param(DETECTOR_ID);
@@ -73,9 +73,9 @@ public List replacedRoutes() {
             // delete anomaly detector document
             new ReplacedRoute(
                 RestRequest.Method.DELETE,
-                String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID),
+                String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID),
                 RestRequest.Method.DELETE,
-                String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID)
+                String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID)
             )
         );
    }
diff --git a/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyResultsAction.java b/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyResultsAction.java
index 8fc65c6aa..84b41f8a8 100644
--- a/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyResultsAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestDeleteAnomalyResultsAction.java
@@ -11,7 +11,7 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
 
 import java.io.IOException;
 import java.util.List;
@@ -19,9 +19,8 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
 import org.opensearch.action.support.IndicesOptions;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.DeleteAnomalyResultsAction;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.core.action.ActionListener;
@@ -33,6 +32,7 @@
 import org.opensearch.rest.BytesRestResponse;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -62,8 +62,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
         SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
         searchSourceBuilder.parseXContent(request.contentOrSourceParamParser());
@@ -85,6 +85,6 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
 
     @Override
     public List routes() {
-        return ImmutableList.of(new Route(RestRequest.Method.DELETE, AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI + "/results"));
+        return ImmutableList.of(new Route(RestRequest.Method.DELETE, TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI + "/results"));
     }
 }
diff --git a/src/main/java/org/opensearch/ad/rest/RestExecuteAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestExecuteAnomalyDetectorAction.java
index 7166ff918..d52749a02 100644
--- a/src/main/java/org/opensearch/ad/rest/RestExecuteAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestExecuteAnomalyDetectorAction.java
@@ -11,10 +11,10 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT;
-import static org.opensearch.ad.util.RestHandlerUtils.DETECTOR_ID;
-import static org.opensearch.ad.util.RestHandlerUtils.RUN;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.RestHandlerUtils.DETECTOR_ID;
+import static org.opensearch.timeseries.util.RestHandlerUtils.RUN;
 
 import java.io.IOException;
 import java.util.List;
@@ -23,10 +23,9 @@
 import org.apache.commons.lang.StringUtils;
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyDetectorExecutionInput;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.AnomalyResultAction;
 import org.opensearch.ad.transport.AnomalyResultRequest;
 import org.opensearch.client.node.NodeClient;
@@ -39,6 +38,7 @@
 import org.opensearch.rest.BytesRestResponse;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -54,8 +54,8 @@ public class RestExecuteAnomalyDetectorAction extends BaseRestHandler {
     private final Logger logger = LogManager.getLogger(RestExecuteAnomalyDetectorAction.class);
 
     public RestExecuteAnomalyDetectorAction(Settings settings, ClusterService clusterService) {
-        this.requestTimeout = REQUEST_TIMEOUT.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(REQUEST_TIMEOUT, it -> requestTimeout = it);
+        this.requestTimeout = AD_REQUEST_TIMEOUT.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_REQUEST_TIMEOUT, it -> requestTimeout = it);
     }
 
     @Override
@@ -65,10 +65,10 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
-        AnomalyDetectorExecutionInput input = getAnomalyDetectorExecutionInput(request);
+        AnomalyDetectorExecutionInput input = getConfigExecutionInput(request);
 
         return channel -> {
             String error = validateAdExecutionInput(input);
             if (StringUtils.isNotBlank(error)) {
@@ -85,7 +85,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
         };
     }
 
-    private AnomalyDetectorExecutionInput getAnomalyDetectorExecutionInput(RestRequest request) throws IOException {
+    private AnomalyDetectorExecutionInput getConfigExecutionInput(RestRequest request) throws IOException {
         String detectorId = null;
         if (request.hasParam(DETECTOR_ID)) {
             detectorId = request.param(DETECTOR_ID);
@@ -125,9 +125,9 @@ public List replacedRoutes() {
             // get AD result, for regular run
             new ReplacedRoute(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, RUN),
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, RUN),
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, RUN)
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, RUN)
             )
         );
    }
diff --git a/src/main/java/org/opensearch/ad/rest/RestGetAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestGetAnomalyDetectorAction.java
index eefc52bda..315ba0410 100644
--- a/src/main/java/org/opensearch/ad/rest/RestGetAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestGetAnomalyDetectorAction.java
@@ -11,9 +11,9 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.DETECTOR_ID;
-import static org.opensearch.ad.util.RestHandlerUtils.PROFILE;
-import static org.opensearch.ad.util.RestHandlerUtils.TYPE;
+import static org.opensearch.timeseries.util.RestHandlerUtils.DETECTOR_ID;
+import static org.opensearch.timeseries.util.RestHandlerUtils.PROFILE;
+import static org.opensearch.timeseries.util.RestHandlerUtils.TYPE;
 
 import java.io.IOException;
 import java.util.List;
@@ -22,11 +22,9 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.GetAnomalyDetectorAction;
 import org.opensearch.ad.transport.GetAnomalyDetectorRequest;
 import org.opensearch.client.node.NodeClient;
@@ -35,6 +33,9 @@
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestActions;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Entity;
 
 import com.google.common.collect.ImmutableList;
 
@@ -55,8 +56,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
         String detectorId = request.param(DETECTOR_ID);
         String typesStr = request.param(TYPE);
@@ -65,7 +66,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
         boolean returnJob = request.paramAsBoolean("job", false);
         boolean returnTask = request.paramAsBoolean("task", false);
         boolean all = request.paramAsBoolean("_all", false);
-        GetAnomalyDetectorRequest getAnomalyDetectorRequest = new GetAnomalyDetectorRequest(
+        GetAnomalyDetectorRequest getConfigRequest = new GetAnomalyDetectorRequest(
             detectorId,
             RestActions.parseVersion(request),
             returnJob,
@@ -76,8 +77,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient cli
             buildEntity(request, detectorId)
         );
 
-        return channel -> client
-            .execute(GetAnomalyDetectorAction.INSTANCE, getAnomalyDetectorRequest, new RestToXContentListener<>(channel));
+        return channel -> client.execute(GetAnomalyDetectorAction.INSTANCE, getConfigRequest, new RestToXContentListener<>(channel));
     }
 
     @Override
@@ -87,40 +87,49 @@ public List routes() {
             // Opensearch-only API. Considering users may provide entity in the search body, support POST as well.
             new Route(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE)
+                String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE)
             ),
             new Route(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/{%s}/%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE, TYPE)
+                String
+                    .format(Locale.ROOT, "%s/{%s}/%s/{%s}", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE, TYPE)
             )
         );
     }
 
     @Override
     public List replacedRoutes() {
-        String path = String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID);
-        String newPath = String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID);
+        String path = String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID);
+        String newPath = String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID);
         return ImmutableList
             .of(
                 new ReplacedRoute(RestRequest.Method.GET, newPath, RestRequest.Method.GET, path),
                 new ReplacedRoute(RestRequest.Method.HEAD, newPath, RestRequest.Method.HEAD, path),
                 new ReplacedRoute(
                     RestRequest.Method.GET,
-                    String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE),
+                    String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE),
                     RestRequest.Method.GET,
-                    String.format(Locale.ROOT, "%s/{%s}/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, PROFILE)
+                    String.format(Locale.ROOT, "%s/{%s}/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID, PROFILE)
                 ),
                 // types is a profile names. See a complete list of supported profiles names in
                 // org.opensearch.ad.model.ProfileName.
                 new ReplacedRoute(
                     RestRequest.Method.GET,
-                    String.format(Locale.ROOT, "%s/{%s}/%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID, PROFILE, TYPE),
+                    String
+                        .format(
+                            Locale.ROOT,
+                            "%s/{%s}/%s/{%s}",
+                            TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI,
+                            DETECTOR_ID,
+                            PROFILE,
+                            TYPE
+                        ),
                     RestRequest.Method.GET,
                     String
                         .format(
                             Locale.ROOT,
                             "%s/{%s}/%s/{%s}",
-                            AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI,
+                            TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI,
                             DETECTOR_ID,
                             PROFILE,
                             TYPE
@@ -131,10 +140,10 @@ public List replacedRoutes() {
 
     private Entity buildEntity(RestRequest request, String detectorId) throws IOException {
         if (Strings.isEmpty(detectorId)) {
-            throw new IllegalStateException(CommonErrorMessages.AD_ID_MISSING_MSG);
+            throw new IllegalStateException(ADCommonMessages.AD_ID_MISSING_MSG);
         }
 
-        String entityName = request.param(CommonName.CATEGORICAL_FIELD);
+        String entityName = request.param(ADCommonName.CATEGORICAL_FIELD);
         String entityValue = request.param(CommonName.ENTITY_KEY);
 
         if (entityName != null && entityValue != null) {
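buildEntity only constructs an entity filter when both the categorical-field name and the entity value arrive as request parameters; otherwise the profile request applies to the whole detector. A standalone approximation follows; the parameter keys and types here are hypothetical stand-ins, not the plugin's constants.

    import java.util.Map;
    import java.util.Optional;

    // Illustrative stand-in for buildEntity.
    final class EntityParams {
        record Entity(String name, String value) {}

        static Optional<Entity> buildEntity(Map<String, String> params, String detectorId) {
            if (detectorId == null || detectorId.isEmpty()) {
                throw new IllegalStateException("anomaly detector ID is missing");
            }
            String entityName = params.get("category_field"); // hypothetical param key
            String entityValue = params.get("entity");        // hypothetical param key
            if (entityName != null && entityValue != null) {
                return Optional.of(new Entity(entityName, entityValue));
            }
            return Optional.empty(); // profile request without an entity filter
        }

        public static void main(String[] args) {
            System.out.println(buildEntity(Map.of("category_field", "ip", "entity", "1.2.3.4"), "d1"));
            System.out.println(buildEntity(Map.of(), "d1"));
        }
    }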
diff --git a/src/main/java/org/opensearch/ad/rest/RestIndexAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestIndexAnomalyDetectorAction.java
index 953f0f0f7..6231d8e11 100644
--- a/src/main/java/org/opensearch/ad/rest/RestIndexAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestIndexAnomalyDetectorAction.java
@@ -11,11 +11,11 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.DETECTOR_ID;
-import static org.opensearch.ad.util.RestHandlerUtils.IF_PRIMARY_TERM;
-import static org.opensearch.ad.util.RestHandlerUtils.IF_SEQ_NO;
-import static org.opensearch.ad.util.RestHandlerUtils.REFRESH;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.RestHandlerUtils.DETECTOR_ID;
+import static org.opensearch.timeseries.util.RestHandlerUtils.IF_PRIMARY_TERM;
+import static org.opensearch.timeseries.util.RestHandlerUtils.IF_SEQ_NO;
+import static org.opensearch.timeseries.util.RestHandlerUtils.REFRESH;
 
 import java.io.IOException;
 import java.util.List;
@@ -24,10 +24,9 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
 import org.opensearch.action.support.WriteRequest;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.IndexAnomalyDetectorAction;
 import org.opensearch.ad.transport.IndexAnomalyDetectorRequest;
 import org.opensearch.ad.transport.IndexAnomalyDetectorResponse;
@@ -43,6 +42,7 @@
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.RestResponse;
 import org.opensearch.rest.action.RestResponseListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -65,8 +65,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         String detectorId = request.param(DETECTOR_ID, AnomalyDetector.NO_ID);
@@ -113,16 +113,16 @@ public List replacedRoutes() {
             // Create
             new ReplacedRoute(
                 RestRequest.Method.POST,
-                AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI,
+                TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI,
                 RestRequest.Method.POST,
-                AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI
+                TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI
             ),
             // Update
             new ReplacedRoute(
                 RestRequest.Method.PUT,
-                String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID),
+                String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, DETECTOR_ID),
                 RestRequest.Method.PUT,
-                String.format(Locale.ROOT, "%s/{%s}", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID)
+                String.format(Locale.ROOT, "%s/{%s}", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, DETECTOR_ID)
             )
         );
     }
@@ -143,7 +143,7 @@ public RestResponse buildResponse(IndexAnomalyDetectorResponse response) throws
             response.toXContent(channel.newBuilder(), ToXContent.EMPTY_PARAMS)
         );
         if (restStatus == RestStatus.CREATED) {
-            String location = String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.LEGACY_AD_BASE, response.getId());
+            String location = String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.LEGACY_AD_BASE, response.getId());
             bytesRestResponse.addHeader("Location", location);
         }
         return bytesRestResponse;
diff --git a/src/main/java/org/opensearch/ad/rest/RestPreviewAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestPreviewAnomalyDetectorAction.java
index 5dc1fafed..cd495517c 100644
--- a/src/main/java/org/opensearch/ad/rest/RestPreviewAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestPreviewAnomalyDetectorAction.java
@@ -11,8 +11,8 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.PREVIEW;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.RestHandlerUtils.PREVIEW;
 
 import java.io.IOException;
 import java.util.List;
@@ -21,13 +21,11 @@
 import org.apache.commons.lang.StringUtils;
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyDetectorExecutionInput;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.PreviewAnomalyDetectorAction;
 import org.opensearch.ad.transport.PreviewAnomalyDetectorRequest;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.rest.RestStatus;
 import org.opensearch.core.xcontent.XContentParser;
@@ -36,6 +34,8 @@
 import org.opensearch.rest.RestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 
 import com.google.common.collect.ImmutableList;
 
@@ -54,11 +54,11 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, org.opensearch.client.node.NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
-        AnomalyDetectorExecutionInput input = getAnomalyDetectorExecutionInput(request);
+        AnomalyDetectorExecutionInput input = getConfigExecutionInput(request);
 
         return channel -> {
             String rawPath = request.rawPath();
@@ -77,7 +77,7 @@ protected RestChannelConsumer prepareRequest(RestRequest request, org.opensearch
         };
     }
 
-    private AnomalyDetectorExecutionInput getAnomalyDetectorExecutionInput(RestRequest request) throws IOException {
+    private AnomalyDetectorExecutionInput getConfigExecutionInput(RestRequest request) throws IOException {
         String detectorId = null;
         if (request.hasParam(RestHandlerUtils.DETECTOR_ID)) {
             detectorId = request.param(RestHandlerUtils.DETECTOR_ID);
@@ -109,7 +109,7 @@ public List routes() {
             // preview detector
             new Route(
                 RestRequest.Method.POST,
-                String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, PREVIEW)
+                String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, PREVIEW)
             )
         );
     }
@@ -125,7 +125,7 @@ public List replacedRoutes() {
                     .format(
                         Locale.ROOT,
                         "%s/{%s}/%s",
-                        AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI,
+                        TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI,
                         RestHandlerUtils.DETECTOR_ID,
                         PREVIEW
                     ),
@@ -134,7 +134,7 @@ public List replacedRoutes() {
                     .format(
                         Locale.ROOT,
                         "%s/{%s}/%s",
-                        AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI,
+                        TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI,
                         RestHandlerUtils.DETECTOR_ID,
                         PREVIEW
                     )
diff --git a/src/main/java/org/opensearch/ad/rest/RestSearchADTasksAction.java b/src/main/java/org/opensearch/ad/rest/RestSearchADTasksAction.java
index 57be1b391..6a1bfce58 100644
--- a/src/main/java/org/opensearch/ad/rest/RestSearchADTasksAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestSearchADTasksAction.java
@@ -12,10 +12,10 @@
 package org.opensearch.ad.rest;
 
 import org.apache.commons.lang3.tuple.Pair;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.transport.SearchADTasksAction;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -24,15 +24,15 @@
  */
 public class RestSearchADTasksAction extends AbstractSearchAction {
 
-    private static final String LEGACY_URL_PATH = AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/tasks/_search";
-    private static final String URL_PATH = AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI + "/tasks/_search";
+    private static final String LEGACY_URL_PATH = TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/tasks/_search";
+    private static final String URL_PATH = TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI + "/tasks/_search";
     private final String SEARCH_ANOMALY_DETECTION_TASKS = "search_anomaly_detection_tasks";
 
     public RestSearchADTasksAction() {
         super(
             ImmutableList.of(),
             ImmutableList.of(Pair.of(URL_PATH, LEGACY_URL_PATH)),
-            CommonName.DETECTION_STATE_INDEX,
+            ADCommonName.DETECTION_STATE_INDEX,
             ADTask.class,
             SearchADTasksAction.INSTANCE
         );
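AbstractSearchAction concentrates the search boilerplate, so each concrete handler above is little more than a constructor call naming its URL pair, backing index, and result class. A reduced model of that shape; the index name and paths are indicative assumptions:

    // Illustrative stand-in for the AbstractSearchAction subclass pattern.
    abstract class SearchAction<T> {
        final String urlPath;
        final String legacyUrlPath;
        final String index;
        final Class<T> resultClass;

        SearchAction(String urlPath, String legacyUrlPath, String index, Class<T> resultClass) {
            this.urlPath = urlPath;
            this.legacyUrlPath = legacyUrlPath;
            this.index = index;
            this.resultClass = resultClass;
        }
    }

    final class AdTask {}

    final class SearchAdTasksAction extends SearchAction<AdTask> {
        SearchAdTasksAction() {
            // Index name assumed from the plugin's DETECTION_STATE_INDEX constant.
            super(
                "/_plugins/_anomaly_detection/detectors/tasks/_search",
                "/_opendistro/_anomaly_detection/detectors/tasks/_search",
                ".opendistro-anomaly-detection-state",
                AdTask.class
            );
        }
    }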
diff --git a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorAction.java
index 26fc442e1..214fa8b2c 100644
--- a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorAction.java
@@ -11,12 +11,11 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX;
-
 import org.apache.commons.lang3.tuple.Pair;
-import org.opensearch.ad.AnomalyDetectorPlugin;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.transport.SearchAnomalyDetectorAction;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.constant.CommonName;
 
 import com.google.common.collect.ImmutableList;
 
@@ -25,15 +24,15 @@
  */
 public class RestSearchAnomalyDetectorAction extends AbstractSearchAction {
 
-    private static final String LEGACY_URL_PATH = AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/_search";
-    private static final String URL_PATH = AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI + "/_search";
+    private static final String LEGACY_URL_PATH = TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/_search";
+    private static final String URL_PATH = TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI + "/_search";
     private final String SEARCH_ANOMALY_DETECTOR_ACTION = "search_anomaly_detector";
 
     public RestSearchAnomalyDetectorAction() {
         super(
             ImmutableList.of(),
             ImmutableList.of(Pair.of(URL_PATH, LEGACY_URL_PATH)),
-            ANOMALY_DETECTORS_INDEX,
+            CommonName.CONFIG_INDEX,
             AnomalyDetector.class,
             SearchAnomalyDetectorAction.INSTANCE
         );
diff --git a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorInfoAction.java b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorInfoAction.java
index 2d99f2510..1f2ade113 100644
--- a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorInfoAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyDetectorInfoAction.java
@@ -11,8 +11,8 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.COUNT;
-import static org.opensearch.ad.util.RestHandlerUtils.MATCH;
+import static org.opensearch.timeseries.util.RestHandlerUtils.COUNT;
+import static org.opensearch.timeseries.util.RestHandlerUtils.MATCH;
 
 import java.io.IOException;
 import java.util.List;
@@ -20,15 +20,15 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.SearchAnomalyDetectorInfoAction;
 import org.opensearch.ad.transport.SearchAnomalyDetectorInfoRequest;
 import org.opensearch.rest.BaseRestHandler;
 import org.opensearch.rest.RestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -47,8 +47,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, org.opensearch.client.node.NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         String detectorName = request.param("name", null);
@@ -71,16 +71,16 @@ public List replacedRoutes() {
             // get the count of number of detectors
             new ReplacedRoute(
                 RestRequest.Method.GET,
-                String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, COUNT),
+                String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, COUNT),
                 RestRequest.Method.GET,
-                String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, COUNT)
+                String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, COUNT)
             ),
             // get if a detector name exists with name
             new ReplacedRoute(
                 RestRequest.Method.GET,
-                String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, MATCH),
+                String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, MATCH),
                 RestRequest.Method.GET,
-                String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, MATCH)
+                String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI, MATCH)
             )
         );
    }
diff --git a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyResultAction.java b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyResultAction.java
index f8468bd15..9db521595 100644
--- a/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyResultAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestSearchAnomalyResultAction.java
@@ -11,9 +11,9 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
-import static org.opensearch.ad.util.RestHandlerUtils.RESULT_INDEX;
-import static org.opensearch.ad.util.RestHandlerUtils.getSourceContext;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.timeseries.util.RestHandlerUtils.RESULT_INDEX;
+import static org.opensearch.timeseries.util.RestHandlerUtils.getSourceContext;
 
 import java.io.IOException;
 import java.util.Locale;
@@ -21,14 +21,14 @@
 import org.apache.commons.lang3.tuple.Pair;
 import org.apache.logging.log4j.util.Strings;
 import org.opensearch.action.search.SearchRequest;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyResult;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.SearchAnomalyResultAction;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
 
 import com.google.common.collect.ImmutableList;
 
@@ -36,8 +36,8 @@
 * This class consists of the REST handler to search anomaly results.
 */
 public class RestSearchAnomalyResultAction extends AbstractSearchAction {
-    private static final String LEGACY_URL_PATH = AnomalyDetectorPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/results/_search";
-    private static final String URL_PATH = AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI + "/results/_search";
+    private static final String LEGACY_URL_PATH = TimeSeriesAnalyticsPlugin.LEGACY_OPENDISTRO_AD_BASE_URI + "/results/_search";
+    private static final String URL_PATH = TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI + "/results/_search";
     public static final String SEARCH_ANOMALY_RESULT_ACTION = "search_anomaly_result";
 
     public RestSearchAnomalyResultAction() {
@@ -57,8 +57,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         // resultIndex could be concrete index or index pattern
diff --git a/src/main/java/org/opensearch/ad/rest/RestSearchTopAnomalyResultAction.java b/src/main/java/org/opensearch/ad/rest/RestSearchTopAnomalyResultAction.java
index 3beb6bd3f..45b8da8a0 100644
--- a/src/main/java/org/opensearch/ad/rest/RestSearchTopAnomalyResultAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestSearchTopAnomalyResultAction.java
@@ -17,17 +17,17 @@
 import java.util.List;
 import java.util.Locale;
 
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.SearchTopAnomalyResultAction;
 import org.opensearch.ad.transport.SearchTopAnomalyResultRequest;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.core.xcontent.XContentParser;
 import org.opensearch.rest.BaseRestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 
 import com.google.common.collect.ImmutableList;
 
@@ -40,7 +40,7 @@ public class RestSearchTopAnomalyResultAction extends BaseRestHandler {
         .format(
             Locale.ROOT,
             "%s/{%s}/%s/%s",
-            AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI,
+            TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI,
             RestHandlerUtils.DETECTOR_ID,
             RestHandlerUtils.RESULTS,
             RestHandlerUtils.TOP_ANOMALIES
@@ -58,8 +58,8 @@ public String getName() {
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
 
         // Throw error if disabled
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
 
         // Get the typed request
@@ -75,7 +75,7 @@ private SearchTopAnomalyResultRequest getSearchTopAnomalyResultRequest(RestReque
         if (request.hasParam(RestHandlerUtils.DETECTOR_ID)) {
             detectorId = request.param(RestHandlerUtils.DETECTOR_ID);
         } else {
-            throw new IllegalStateException(CommonErrorMessages.AD_ID_MISSING_MSG);
+            throw new IllegalStateException(ADCommonMessages.AD_ID_MISSING_MSG);
         }
         boolean historical = request.paramAsBoolean("historical", false);
         XContentParser parser = request.contentParser();
diff --git a/src/main/java/org/opensearch/ad/rest/RestStatsAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestStatsAnomalyDetectorAction.java
index 5ad871fc6..65b936e98 100644
--- a/src/main/java/org/opensearch/ad/rest/RestStatsAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestStatsAnomalyDetectorAction.java
@@ -11,8 +11,8 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.AnomalyDetectorPlugin.AD_BASE_URI;
-import static org.opensearch.ad.AnomalyDetectorPlugin.LEGACY_AD_BASE;
+import static org.opensearch.timeseries.TimeSeriesAnalyticsPlugin.AD_BASE_URI;
+import static org.opensearch.timeseries.TimeSeriesAnalyticsPlugin.LEGACY_AD_BASE;
 
 import java.util.Arrays;
 import java.util.HashSet;
@@ -20,12 +20,11 @@
 import java.util.Set;
 import java.util.TreeSet;
 
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.stats.ADStats;
 import org.opensearch.ad.transport.ADStatsRequest;
 import org.opensearch.ad.transport.StatsAnomalyDetectorAction;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.service.ClusterService;
@@ -33,6 +32,7 @@
 import org.opensearch.rest.BaseRestHandler;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
 
 import com.google.common.collect.ImmutableList;
 
@@ -64,8 +64,8 @@ public String getName() {
 
     @Override
     protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
         }
         ADStatsRequest adStatsRequest = getRequest(request);
         return channel -> client.execute(StatsAnomalyDetectorAction.INSTANCE, adStatsRequest, new RestToXContentListener<>(channel));
diff --git a/src/main/java/org/opensearch/ad/rest/RestValidateAnomalyDetectorAction.java b/src/main/java/org/opensearch/ad/rest/RestValidateAnomalyDetectorAction.java
index 5ef1d7be2..e728889f8 100644
--- a/src/main/java/org/opensearch/ad/rest/RestValidateAnomalyDetectorAction.java
+++ b/src/main/java/org/opensearch/ad/rest/RestValidateAnomalyDetectorAction.java
@@ -11,9 +11,9 @@
 package org.opensearch.ad.rest;
 
-import static org.opensearch.ad.util.RestHandlerUtils.TYPE;
-import static org.opensearch.ad.util.RestHandlerUtils.VALIDATE;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.RestHandlerUtils.TYPE;
+import static org.opensearch.timeseries.util.RestHandlerUtils.VALIDATE;
 
 import java.io.IOException;
 import java.util.Arrays;
@@ -25,13 +25,10 @@
 import java.util.stream.Collectors;
 
 import org.apache.commons.lang3.StringUtils;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.common.exception.ADValidationException;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.DetectorValidationIssue;
-import org.opensearch.ad.model.ValidationAspect;
-import org.opensearch.ad.settings.EnabledSetting;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.transport.ValidateAnomalyDetectorAction;
 import org.opensearch.ad.transport.ValidateAnomalyDetectorRequest;
 import org.opensearch.ad.transport.ValidateAnomalyDetectorResponse;
@@ -45,6 +42,9 @@
 import org.opensearch.rest.RestChannel;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.rest.action.RestToXContentListener;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.common.exception.ValidationException;
+import org.opensearch.timeseries.model.ValidationAspect;
 
 import com.google.common.collect.ImmutableList;
 
@@ -75,11 +75,11 @@ public List routes() {
             .of(
                 new Route(
                     RestRequest.Method.POST,
-                    String.format(Locale.ROOT, "%s/%s", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, VALIDATE)
+                    String.format(Locale.ROOT, "%s/%s", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, VALIDATE)
                 ),
                 new Route(
                     RestRequest.Method.POST,
-                    String.format(Locale.ROOT, "%s/%s/{%s}", AnomalyDetectorPlugin.AD_BASE_DETECTORS_URI, VALIDATE, TYPE)
+                    String.format(Locale.ROOT, "%s/%s/{%s}", TimeSeriesAnalyticsPlugin.AD_BASE_DETECTORS_URI, VALIDATE, TYPE)
                 )
             );
     }
@@ -103,8 +103,8 @@ private Boolean validationTypesAreAccepted(String validationType) {
 
     @Override
     protected BaseRestHandler.RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException {
-        if (!EnabledSetting.isADPluginEnabled()) {
-            throw new IllegalStateException(CommonErrorMessages.DISABLED_ERR_MSG);
+        if (!ADEnabledSetting.isADEnabled()) {
+            throw new IllegalStateException(ADCommonMessages.DISABLED_ERR_MSG);
        }
         XContentParser parser = request.contentParser();
         ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
@@ -113,7 +113,7 @@ protected BaseRestHandler.RestChannelConsumer prepareRequest(RestRequest request
         // if type param isn't blank and isn't a part of possible validation types throws exception
         if (!StringUtils.isBlank(typesStr)) {
             if (!validationTypesAreAccepted(typesStr)) {
-                throw new IllegalStateException(CommonErrorMessages.NOT_EXISTENT_VALIDATION_TYPE);
+                throw new IllegalStateException(ADCommonMessages.NOT_EXISTENT_VALIDATION_TYPE);
             }
         }
 
@@ -122,8 +122,8 @@ protected BaseRestHandler.RestChannelConsumer prepareRequest(RestRequest request
         try {
             detector = AnomalyDetector.parse(parser);
         } catch (Exception ex) {
-            if (ex instanceof ADValidationException) {
-                ADValidationException ADException = (ADValidationException) ex;
+            if (ex instanceof ValidationException) {
+                ValidationException ADException = (ValidationException) ex;
                 DetectorValidationIssue issue = new DetectorValidationIssue(
                     ADException.getAspect(),
                     ADException.getType(),
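The catch block above narrows a parse failure to ValidationException and repackages its aspect and issue type into a DetectorValidationIssue, so validation responses stay structured instead of surfacing as generic errors. A self-contained sketch of that conversion with simplified stand-in types (none of these are the plugin's real classes):

    // Illustrative stand-ins for the validation-issue conversion pattern.
    final class ValidationDemo {
        enum Aspect { DETECTOR }
        enum IssueType { NAME, TIMEFIELD_FIELD, RESULT_INDEX }

        static class ValidationException extends RuntimeException {
            final Aspect aspect;
            final IssueType type;

            ValidationException(String message, IssueType type, Aspect aspect) {
                super(message);
                this.type = type;
                this.aspect = aspect;
            }
        }

        record ValidationIssue(Aspect aspect, IssueType type, String message) {}

        static ValidationIssue toIssue(Exception ex) {
            if (ex instanceof ValidationException ve) {
                // Structured failure: surface aspect + type to the caller.
                return new ValidationIssue(ve.aspect, ve.type, ve.getMessage());
            }
            return null; // non-validation failures propagate normally
        }

        public static void main(String[] args) {
            Exception ex = new ValidationException("invalid name", IssueType.NAME, Aspect.DETECTOR);
            System.out.println(toIssue(ex));
        }
    }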
diff --git a/src/main/java/org/opensearch/ad/rest/handler/AbstractAnomalyDetectorActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/AbstractAnomalyDetectorActionHandler.java
index b920e671d..614d47bee 100644
--- a/src/main/java/org/opensearch/ad/rest/handler/AbstractAnomalyDetectorActionHandler.java
+++ b/src/main/java/org/opensearch/ad/rest/handler/AbstractAnomalyDetectorActionHandler.java
@@ -11,14 +11,13 @@
 package org.opensearch.ad.rest.handler;
 
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG;
 import static org.opensearch.ad.model.ADTaskType.HISTORICAL_DETECTOR_TASK_TYPES;
-import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX;
-import static org.opensearch.ad.util.ParseUtils.listEqualsWithoutConsideringOrder;
-import static org.opensearch.ad.util.ParseUtils.parseAggregators;
-import static org.opensearch.ad.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
-import static org.opensearch.ad.util.RestHandlerUtils.isExceptionCausedByInvalidQuery;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG;
+import static org.opensearch.timeseries.util.ParseUtils.listEqualsWithoutConsideringOrder;
+import static org.opensearch.timeseries.util.ParseUtils.parseAggregators;
+import static org.opensearch.timeseries.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
+import static org.opensearch.timeseries.util.RestHandlerUtils.isExceptionCausedByInvalidQuery;
 
 import java.io.IOException;
 import java.time.Clock;
@@ -51,24 +50,13 @@
 import org.opensearch.action.support.IndicesOptions;
 import org.opensearch.action.support.WriteRequest;
 import org.opensearch.action.support.replication.ReplicationResponse;
-import org.opensearch.ad.common.exception.ADValidationException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.feature.SearchFeatureDao;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.DetectorValidationIssueType;
-import org.opensearch.ad.model.Feature;
-import org.opensearch.ad.model.MergeableList;
-import org.opensearch.ad.model.ValidationAspect;
 import org.opensearch.ad.rest.RestValidateAnomalyDetectorAction;
-import org.opensearch.ad.settings.NumericSetting;
+import org.opensearch.ad.settings.ADNumericSetting;
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.ad.transport.IndexAnomalyDetectorResponse;
 import org.opensearch.ad.transport.ValidateAnomalyDetectorResponse;
-import org.opensearch.ad.util.MultiResponsesDelegateActionListener;
-import org.opensearch.ad.util.RestHandlerUtils;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.Settings;
@@ -87,6 +75,18 @@
 import org.opensearch.rest.RestRequest;
 import org.opensearch.search.aggregations.AggregatorFactories;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.common.exception.ValidationException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.model.MergeableList;
+import org.opensearch.timeseries.model.ValidationAspect;
+import org.opensearch.timeseries.model.ValidationIssueType;
+import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener;
+import org.opensearch.timeseries.util.RestHandlerUtils;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;
 
 import com.google.common.collect.Sets;
@@ -129,7 +129,9 @@ public abstract class AbstractAnomalyDetectorActionHandler
     DEFAULT_VALIDATION_ASPECTS = Sets.newHashSet(ValidationAspect.DETECTOR);
 
-    protected final AnomalyDetectionIndices anomalyDetectionIndices;
+    public static String INVALID_NAME_SIZE = "Name should be shortened. The maximum limit is " + MAX_DETECTOR_NAME_SIZE + " characters.";
+
+    protected final ADIndexManagement anomalyDetectionIndices;
     protected final String detectorId;
     protected final Long seqNo;
     protected final Long primaryTerm;
@@ -191,7 +193,7 @@ public AbstractAnomalyDetectorActionHandler(
         SecurityClientUtil clientUtil,
         TransportService transportService,
         ActionListener listener,
-        AnomalyDetectionIndices anomalyDetectionIndices,
+        ADIndexManagement anomalyDetectionIndices,
         String detectorId,
         Long seqNo,
         Long primaryTerm,
@@ -247,7 +249,7 @@ public AbstractAnomalyDetectorActionHandler(
      * mapping.
      */
     public void start() {
-        String resultIndex = anomalyDetector.getResultIndex();
+        String resultIndex = anomalyDetector.getCustomResultIndex();
         // use default detector result index which is system index
         if (resultIndex == null) {
             createOrUpdateDetector();
@@ -264,11 +266,7 @@ public void start() {
                 logger.error(ex);
                 listener
                     .onFailure(
-                        new ADValidationException(
-                            ex.getMessage(),
-                            DetectorValidationIssueType.RESULT_INDEX,
-                            ValidationAspect.DETECTOR
-                        )
+                        new ValidationException(ex.getMessage(), ValidationIssueType.RESULT_INDEX, ValidationAspect.DETECTOR)
                     );
                 return;
             })
@@ -287,10 +285,10 @@ public void start() {
     // index won't be created, only validation checks will be executed throughout the class
     private void createOrUpdateDetector() {
         try (ThreadContext.StoredContext context = client.threadPool().getThreadContext().stashContext()) {
-            if (!anomalyDetectionIndices.doesAnomalyDetectorIndexExist() && !this.isDryRun) {
+            if (!anomalyDetectionIndices.doesConfigIndexExist() && !this.isDryRun) {
                 logger.info("AnomalyDetector Indices do not exist");
                 anomalyDetectionIndices
-                    .initAnomalyDetectorIndex(
+                    .initConfigIndex(
                         ActionListener
                             .wrap(response -> onCreateMappingsResponse(response, false), exception -> listener.onFailure(exception))
                     );
@@ -310,26 +308,12 @@ private void createOrUpdateDetector() {
     // because it was never check on the backend in the past
     protected void validateDetectorName(boolean indexingDryRun) {
         if (!anomalyDetector.getName().matches(NAME_REGEX)) {
-            listener
-                .onFailure(
-                    new ADValidationException(
-                        CommonErrorMessages.INVALID_DETECTOR_NAME,
-                        DetectorValidationIssueType.NAME,
-                        ValidationAspect.DETECTOR
-                    )
-                );
+            listener.onFailure(new ValidationException(CommonMessages.INVALID_NAME, ValidationIssueType.NAME, ValidationAspect.DETECTOR));
             return;
         }
         if (anomalyDetector.getName().length() > MAX_DETECTOR_NAME_SIZE) {
-            listener
-                .onFailure(
-                    new ADValidationException(
-                        CommonErrorMessages.INVALID_DETECTOR_NAME_SIZE,
-                        DetectorValidationIssueType.NAME,
-                        ValidationAspect.DETECTOR
-                    )
-                );
+            listener.onFailure(new ValidationException(INVALID_NAME_SIZE, ValidationIssueType.NAME, ValidationAspect.DETECTOR));
             return;
         }
         validateTimeField(indexingDryRun);
@@ -363,9 +347,9 @@ protected void validateTimeField(boolean indexingDryRun) {
                     if (!typeName.equals(CommonName.DATE_TYPE)) {
                         listener
                             .onFailure(
-                                new ADValidationException(
-                                    String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIMESTAMP, givenTimeField),
-                                    DetectorValidationIssueType.TIMEFIELD_FIELD,
+                                new ValidationException(
+                                    String.format(Locale.ROOT, CommonMessages.INVALID_TIMESTAMP, givenTimeField),
+                                    ValidationIssueType.TIMEFIELD_FIELD,
                                     ValidationAspect.DETECTOR
                                 )
                             );
@@ -380,9 +364,9 @@ protected void validateTimeField(boolean indexingDryRun) {
             if (!foundField) {
                 listener
                     .onFailure(
-                        new ADValidationException(
-                            String.format(Locale.ROOT, CommonErrorMessages.NON_EXISTENT_TIMESTAMP, givenTimeField),
String.format(Locale.ROOT, CommonErrorMessages.NON_EXISTENT_TIMESTAMP, givenTimeField), - DetectorValidationIssueType.TIMEFIELD_FIELD, + new ValidationException( + String.format(Locale.ROOT, CommonMessages.NON_EXISTENT_TIMESTAMP, givenTimeField), + ValidationIssueType.TIMEFIELD_FIELD, ValidationAspect.DETECTOR ) ); @@ -394,7 +378,15 @@ protected void validateTimeField(boolean indexingDryRun) { logger.error(message, error); listener.onFailure(new IllegalArgumentException(message)); }); - clientUtil.executeWithInjectedSecurity(GetFieldMappingsAction.INSTANCE, getMappingsRequest, user, client, mappingsListener); + clientUtil + .executeWithInjectedSecurity( + GetFieldMappingsAction.INSTANCE, + getMappingsRequest, + user, + client, + AnalysisType.AD, + mappingsListener + ); } /** @@ -418,7 +410,7 @@ protected void prepareAnomalyDetectorIndexing(boolean indexingDryRun) { } protected void updateAnomalyDetector(String detectorId, boolean indexingDryRun) { - GetRequest request = new GetRequest(ANOMALY_DETECTORS_INDEX, detectorId); + GetRequest request = new GetRequest(CommonName.CONFIG_INDEX, detectorId); client .get( request, @@ -432,7 +424,7 @@ protected void updateAnomalyDetector(String detectorId, boolean indexingDryRun) private void onGetAnomalyDetectorResponse(GetResponse response, boolean indexingDryRun, String detectorId) { if (!response.isExists()) { - listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, RestStatus.NOT_FOUND)); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + detectorId, RestStatus.NOT_FOUND)); return; } try (XContentParser parser = RestHandlerUtils.createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef())) { @@ -444,13 +436,13 @@ private void onGetAnomalyDetectorResponse(GetResponse response, boolean indexing // If single-category HC changed category field from IP to error type, the AD result page may show both IP and error type // in top N entities list. That's confusing. // So we decide to block updating detector category field. 
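Note: the category-field and custom-result-index guards in the next hunk hinge on an order-insensitive list comparison. A minimal, self-contained sketch of that semantics, assuming ParseUtils.listEqualsWithoutConsideringOrder simply compares the two lists as sets (the real helper's null handling may differ):

    import java.util.HashSet;
    import java.util.List;

    public class CategoryFieldCheckSketch {
        // Two category-field lists count as equal when they hold the same
        // elements in any order; null is only equal to null.
        static boolean listEqualsWithoutConsideringOrder(List<String> a, List<String> b) {
            if (a == null || b == null) {
                return a == b;
            }
            return a.size() == b.size() && new HashSet<>(a).equals(new HashSet<>(b));
        }

        public static void main(String[] args) {
            // Same fields, different order: the update is allowed.
            System.out.println(listEqualsWithoutConsideringOrder(List.of("ip", "error_type"), List.of("error_type", "ip")));
            // Changed field set: the update is rejected with CAN_NOT_CHANGE_CATEGORY_FIELD.
            System.out.println(listEqualsWithoutConsideringOrder(List.of("ip"), List.of("error_type")));
        }
    }
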
- if (!listEqualsWithoutConsideringOrder(existingDetector.getCategoryField(), anomalyDetector.getCategoryField())) { - listener - .onFailure(new OpenSearchStatusException(CommonErrorMessages.CAN_NOT_CHANGE_CATEGORY_FIELD, RestStatus.BAD_REQUEST)); + if (!listEqualsWithoutConsideringOrder(existingDetector.getCategoryFields(), anomalyDetector.getCategoryFields())) { + listener.onFailure(new OpenSearchStatusException(CommonMessages.CAN_NOT_CHANGE_CATEGORY_FIELD, RestStatus.BAD_REQUEST)); return; } - if (!Objects.equals(existingDetector.getResultIndex(), anomalyDetector.getResultIndex())) { - listener.onFailure(new OpenSearchStatusException(CommonErrorMessages.CAN_NOT_CHANGE_RESULT_INDEX, RestStatus.BAD_REQUEST)); + if (!Objects.equals(existingDetector.getCustomResultIndex(), anomalyDetector.getCustomResultIndex())) { + listener + .onFailure(new OpenSearchStatusException(CommonMessages.CAN_NOT_CHANGE_CUSTOM_RESULT_INDEX, RestStatus.BAD_REQUEST)); return; } @@ -479,16 +471,16 @@ protected void validateExistingDetector(AnomalyDetector existingDetector, boolea } protected boolean hasCategoryField(AnomalyDetector detector) { - return detector.getCategoryField() != null && !detector.getCategoryField().isEmpty(); + return detector.getCategoryFields() != null && !detector.getCategoryFields().isEmpty(); } protected void validateAgainstExistingMultiEntityAnomalyDetector(String detectorId, boolean indexingDryRun) { - if (anomalyDetectionIndices.doesAnomalyDetectorIndexExist()) { + if (anomalyDetectionIndices.doesConfigIndexExist()) { QueryBuilder query = QueryBuilders.boolQuery().filter(QueryBuilders.existsQuery(AnomalyDetector.CATEGORY_FIELD)); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().query(query).size(0).timeout(requestTimeout); - SearchRequest searchRequest = new SearchRequest(ANOMALY_DETECTORS_INDEX).source(searchSourceBuilder); + SearchRequest searchRequest = new SearchRequest(CommonName.CONFIG_INDEX).source(searchSourceBuilder); client .search( searchRequest, @@ -506,15 +498,15 @@ protected void validateAgainstExistingMultiEntityAnomalyDetector(String detector protected void createAnomalyDetector(boolean indexingDryRun) { try { - List categoricalFields = anomalyDetector.getCategoryField(); + List categoricalFields = anomalyDetector.getCategoryFields(); if (categoricalFields != null && categoricalFields.size() > 0) { validateAgainstExistingMultiEntityAnomalyDetector(null, indexingDryRun); } else { - if (anomalyDetectionIndices.doesAnomalyDetectorIndexExist()) { + if (anomalyDetectionIndices.doesConfigIndexExist()) { QueryBuilder query = QueryBuilders.matchAllQuery(); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().query(query).size(0).timeout(requestTimeout); - SearchRequest searchRequest = new SearchRequest(ANOMALY_DETECTORS_INDEX).source(searchSourceBuilder); + SearchRequest searchRequest = new SearchRequest(CommonName.CONFIG_INDEX).source(searchSourceBuilder); client .search( @@ -543,11 +535,7 @@ protected void onSearchSingleEntityAdResponse(SearchResponse response, boolean i if (indexingDryRun) { listener .onFailure( - new ADValidationException( - errorMsgSingleEntity, - DetectorValidationIssueType.GENERAL_SETTINGS, - ValidationAspect.DETECTOR - ) + new ValidationException(errorMsgSingleEntity, ValidationIssueType.GENERAL_SETTINGS, ValidationAspect.DETECTOR) ); return; } @@ -562,10 +550,7 @@ protected void onSearchMultiEntityAdResponse(SearchResponse response, String det String errorMsg = String.format(Locale.ROOT, 
EXCEEDED_MAX_MULTI_ENTITY_DETECTORS_PREFIX_MSG, maxMultiEntityAnomalyDetectors); logger.error(errorMsg); if (indexingDryRun) { - listener - .onFailure( - new ADValidationException(errorMsg, DetectorValidationIssueType.GENERAL_SETTINGS, ValidationAspect.DETECTOR) - ); + listener.onFailure(new ValidationException(errorMsg, ValidationIssueType.GENERAL_SETTINGS, ValidationAspect.DETECTOR)); return; } listener.onFailure(new IllegalArgumentException(errorMsg)); @@ -576,7 +561,7 @@ protected void onSearchMultiEntityAdResponse(SearchResponse response, String det @SuppressWarnings("unchecked") protected void validateCategoricalField(String detectorId, boolean indexingDryRun) { - List categoryField = anomalyDetector.getCategoryField(); + List categoryField = anomalyDetector.getCategoryFields(); if (categoryField == null) { searchAdInputIndices(detectorId, indexingDryRun); @@ -586,13 +571,13 @@ protected void validateCategoricalField(String detectorId, boolean indexingDryRu // we only support a certain number of categorical field // If there is more fields than required, AnomalyDetector's constructor // throws ADValidationException before reaching this line - int maxCategoryFields = NumericSetting.maxCategoricalFields(); + int maxCategoryFields = ADNumericSetting.maxCategoricalFields(); if (categoryField.size() > maxCategoryFields) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.getTooManyCategoricalFieldErr(maxCategoryFields), - DetectorValidationIssueType.CATEGORY, + new ValidationException( + CommonMessages.getTooManyCategoricalFieldErr(maxCategoryFields), + ValidationIssueType.CATEGORY, ValidationAspect.DETECTOR ) ); @@ -639,9 +624,9 @@ protected void validateCategoricalField(String detectorId, boolean indexingDryRu if (!typeName.equals(CommonName.KEYWORD_TYPE) && !typeName.equals(CommonName.IP_TYPE)) { listener .onFailure( - new ADValidationException( + new ValidationException( CATEGORICAL_FIELD_TYPE_ERR_MSG, - DetectorValidationIssueType.CATEGORY, + ValidationIssueType.CATEGORY, ValidationAspect.DETECTOR ) ); @@ -658,9 +643,9 @@ protected void validateCategoricalField(String detectorId, boolean indexingDryRu if (foundField == false) { listener .onFailure( - new ADValidationException( + new ValidationException( String.format(Locale.ROOT, CATEGORY_NOT_FOUND_ERR_MSG, categoryField0), - DetectorValidationIssueType.CATEGORY, + ValidationIssueType.CATEGORY, ValidationAspect.DETECTOR ) ); @@ -674,7 +659,15 @@ protected void validateCategoricalField(String detectorId, boolean indexingDryRu listener.onFailure(new IllegalArgumentException(message)); }); - clientUtil.executeWithInjectedSecurity(GetFieldMappingsAction.INSTANCE, getMappingsRequest, user, client, mappingsListener); + clientUtil + .executeWithInjectedSecurity( + GetFieldMappingsAction.INSTANCE, + getMappingsRequest, + user, + client, + AnalysisType.AD, + mappingsListener + ); } protected void searchAdInputIndices(String detectorId, boolean indexingDryRun) { @@ -691,7 +684,7 @@ protected void searchAdInputIndices(String detectorId, boolean indexingDryRun) { exception -> listener.onFailure(exception) ); - clientUtil.asyncRequestWithInjectedSecurity(searchRequest, client::search, user, client, searchResponseListener); + clientUtil.asyncRequestWithInjectedSecurity(searchRequest, client::search, user, client, AnalysisType.AD, searchResponseListener); } protected void onSearchAdInputIndicesResponse(SearchResponse response, String detectorId, boolean indexingDryRun) throws IOException { @@ -699,7 +692,7 @@ protected void 
onSearchAdInputIndicesResponse(SearchResponse response, String de String errorMsg = NO_DOCS_IN_USER_INDEX_MSG + Arrays.toString(anomalyDetector.getIndices().toArray(new String[0])); logger.error(errorMsg); if (indexingDryRun) { - listener.onFailure(new ADValidationException(errorMsg, DetectorValidationIssueType.INDICES, ValidationAspect.DETECTOR)); + listener.onFailure(new ValidationException(errorMsg, ValidationIssueType.INDICES, ValidationAspect.DETECTOR)); return; } listener.onFailure(new IllegalArgumentException(errorMsg)); @@ -709,7 +702,7 @@ protected void onSearchAdInputIndicesResponse(SearchResponse response, String de } protected void checkADNameExists(String detectorId, boolean indexingDryRun) throws IOException { - if (anomalyDetectionIndices.doesAnomalyDetectorIndexExist()) { + if (anomalyDetectionIndices.doesConfigIndexExist()) { BoolQueryBuilder boolQueryBuilder = new BoolQueryBuilder(); // src/main/resources/mappings/anomaly-detectors.json#L14 boolQueryBuilder.must(QueryBuilders.termQuery("name.keyword", anomalyDetector.getName())); @@ -717,7 +710,7 @@ protected void checkADNameExists(String detectorId, boolean indexingDryRun) thro boolQueryBuilder.mustNot(QueryBuilders.termQuery(RestHandlerUtils._ID, detectorId)); } SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().query(boolQueryBuilder).timeout(requestTimeout); - SearchRequest searchRequest = new SearchRequest(ANOMALY_DETECTORS_INDEX).source(searchSourceBuilder); + SearchRequest searchRequest = new SearchRequest(CommonName.CONFIG_INDEX).source(searchSourceBuilder); client .search( searchRequest, @@ -744,7 +737,7 @@ protected void onSearchADNameResponse(SearchResponse response, String detectorId Arrays.stream(response.getHits().getHits()).map(hit -> hit.getId()).collect(Collectors.toList()) ); logger.warn(errorMsg); - listener.onFailure(new ADValidationException(errorMsg, DetectorValidationIssueType.NAME, ValidationAspect.DETECTOR)); + listener.onFailure(new ValidationException(errorMsg, ValidationIssueType.NAME, ValidationAspect.DETECTOR)); } else { tryIndexingAnomalyDetector(indexingDryRun); } @@ -794,7 +787,7 @@ protected void finishDetectorValidationOrContinueToModelValidation() { @SuppressWarnings("unchecked") protected void indexAnomalyDetector(String detectorId) throws IOException { AnomalyDetector detector = new AnomalyDetector( - anomalyDetector.getDetectorId(), + anomalyDetector.getId(), anomalyDetector.getVersion(), anomalyDetector.getName(), anomalyDetector.getDescription(), @@ -802,17 +795,18 @@ protected void indexAnomalyDetector(String detectorId) throws IOException { anomalyDetector.getIndices(), anomalyDetector.getFeatureAttributes(), anomalyDetector.getFilterQuery(), - anomalyDetector.getDetectionInterval(), + anomalyDetector.getInterval(), anomalyDetector.getWindowDelay(), anomalyDetector.getShingleSize(), anomalyDetector.getUiMetadata(), anomalyDetector.getSchemaVersion(), Instant.now(), - anomalyDetector.getCategoryField(), + anomalyDetector.getCategoryFields(), user, - anomalyDetector.getResultIndex() + anomalyDetector.getCustomResultIndex(), + anomalyDetector.getImputationOption() ); - IndexRequest indexRequest = new IndexRequest(ANOMALY_DETECTORS_INDEX) + IndexRequest indexRequest = new IndexRequest(CommonName.CONFIG_INDEX) .setRefreshPolicy(refreshPolicy) .source(detector.toXContent(XContentFactory.jsonBuilder(), XCONTENT_WITH_TYPE)) .setIfSeqNo(seqNo) @@ -860,14 +854,14 @@ public void onFailure(Exception e) { protected void onCreateMappingsResponse(CreateIndexResponse response, 
boolean indexingDryRun) throws IOException { if (response.isAcknowledged()) { - logger.info("Created {} with mappings.", ANOMALY_DETECTORS_INDEX); + logger.info("Created {} with mappings.", CommonName.CONFIG_INDEX); prepareAnomalyDetectorIndexing(indexingDryRun); } else { - logger.warn("Created {} with mappings call not acknowledged.", ANOMALY_DETECTORS_INDEX); + logger.warn("Created {} with mappings call not acknowledged.", CommonName.CONFIG_INDEX); listener .onFailure( new OpenSearchStatusException( - "Created " + ANOMALY_DETECTORS_INDEX + "with mappings call not acknowledged.", + "Created " + CommonName.CONFIG_INDEX + " with mappings call not acknowledged.", RestStatus.INTERNAL_SERVER_ERROR ) ); @@ -900,11 +894,10 @@ protected void validateAnomalyDetectorFeatures(String detectorId, boolean indexingDryRun) { return; } // checking configuration/syntax error of detector features - String error = RestHandlerUtils.checkAnomalyDetectorFeaturesSyntax(anomalyDetector, maxAnomalyFeatures); + String error = RestHandlerUtils.checkFeaturesSyntax(anomalyDetector, maxAnomalyFeatures); if (StringUtils.isNotBlank(error)) { if (indexingDryRun) { - listener - .onFailure(new ADValidationException(error, DetectorValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.DETECTOR)); + listener.onFailure(new ValidationException(error, ValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.DETECTOR)); return; } listener.onFailure(new OpenSearchStatusException(error, RestStatus.BAD_REQUEST)); @@ -916,18 +909,14 @@ protected void validateAnomalyDetectorFeatures(String detectorId, boolean indexingDryRun) { }, exception -> { listener .onFailure( - new ADValidationException( - exception.getMessage(), - DetectorValidationIssueType.FEATURE_ATTRIBUTES, - ValidationAspect.DETECTOR - ) + new ValidationException(exception.getMessage(), ValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.DETECTOR) ); }); MultiResponsesDelegateActionListener<MergeableList<Optional<double[]>>> multiFeatureQueriesResponseListener = new MultiResponsesDelegateActionListener<MergeableList<Optional<double[]>>>( validateFeatureQueriesListener, anomalyDetector.getFeatureAttributes().size(), - String.format(Locale.ROOT, CommonErrorMessages.VALIDATION_FEATURE_FAILURE, anomalyDetector.getName()), + String.format(Locale.ROOT, "Validation failed for feature(s) of detector %s", anomalyDetector.getName()), false ); @@ -948,21 +937,22 @@ protected void validateAnomalyDetectorFeatures(String detectorId, boolean indexingDryRun) { new MergeableList<Optional<double[]>>(new ArrayList<Optional<double[]>>(Arrays.asList(aggFeatureResult))) ); } else { - String errorMessage = CommonErrorMessages.FEATURE_WITH_EMPTY_DATA_MSG + feature.getName(); + String errorMessage = CommonMessages.FEATURE_WITH_EMPTY_DATA_MSG + feature.getName(); logger.error(errorMessage); multiFeatureQueriesResponseListener.onFailure(new OpenSearchStatusException(errorMessage, RestStatus.BAD_REQUEST)); } }, e -> { String errorMessage; if (isExceptionCausedByInvalidQuery(e)) { - errorMessage = CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG + feature.getName(); + errorMessage = CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG + feature.getName(); } else { - errorMessage = CommonErrorMessages.UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG + feature.getName(); + errorMessage = CommonMessages.UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG + feature.getName(); } logger.error(errorMessage, e); multiFeatureQueriesResponseListener.onFailure(new OpenSearchStatusException(errorMessage, RestStatus.BAD_REQUEST, e)); }); - clientUtil.asyncRequestWithInjectedSecurity(searchRequest, client::search, user, client, searchResponseListener); + clientUtil
.asyncRequestWithInjectedSecurity(searchRequest, client::search, user, client, AnalysisType.AD, searchResponseListener); } } } diff --git a/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorActionHandler.java index e87567ac8..28e68d0fb 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorActionHandler.java +++ b/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorActionHandler.java @@ -11,7 +11,6 @@ package org.opensearch.ad.rest.handler; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import java.io.IOException; @@ -21,14 +20,16 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.get.GetRequest; import org.opensearch.action.get.GetResponse; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.core.action.ActionListener; import org.opensearch.core.rest.RestStatus; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.RestHandlerUtils; /** * Common handler to process AD request. @@ -53,11 +54,11 @@ public void getDetectorJob( Client client, String detectorId, ActionListener listener, - AnomalyDetectorFunction function, + ExecutorFunction function, NamedXContentRegistry xContentRegistry ) { - if (clusterService.state().metadata().indices().containsKey(ANOMALY_DETECTOR_JOB_INDEX)) { - GetRequest request = new GetRequest(ANOMALY_DETECTOR_JOB_INDEX).id(detectorId); + if (clusterService.state().metadata().indices().containsKey(CommonName.JOB_INDEX)) { + GetRequest request = new GetRequest(CommonName.JOB_INDEX).id(detectorId); client .get( request, @@ -75,7 +76,7 @@ public void getDetectorJob( private void onGetAdJobResponseForWrite( GetResponse response, ActionListener listener, - AnomalyDetectorFunction function, + ExecutorFunction function, NamedXContentRegistry xContentRegistry ) { if (response.isExists()) { @@ -87,7 +88,7 @@ private void onGetAdJobResponseForWrite( .createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef()) ) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob adJob = AnomalyDetectorJob.parse(parser); + Job adJob = Job.parse(parser); if (adJob.isEnabled()) { listener.onFailure(new OpenSearchStatusException("Detector job is running: " + adJobId, RestStatus.BAD_REQUEST)); return; diff --git a/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorActionHandler.java index cec58520c..bed6a7998 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorActionHandler.java +++ b/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorActionHandler.java @@ -12,12 +12,10 @@ package org.opensearch.ad.rest.handler; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.feature.SearchFeatureDao; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import 
org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.IndexAnomalyDetectorResponse; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; @@ -26,6 +24,8 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.rest.RestRequest; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; /** @@ -66,7 +66,7 @@ public IndexAnomalyDetectorActionHandler( SecurityClientUtil clientUtil, TransportService transportService, ActionListener listener, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, String detectorId, Long seqNo, Long primaryTerm, diff --git a/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandler.java index 47c9b2dbc..5a3b19f24 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandler.java +++ b/src/main/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandler.java @@ -13,10 +13,9 @@ import static org.opensearch.action.DocWriteResponse.Result.CREATED; import static org.opensearch.action.DocWriteResponse.Result.UPDATED; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.util.ExceptionUtil.getShardsFailure; -import static org.opensearch.ad.util.RestHandlerUtils.createXContentParserFromRegistry; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.util.ExceptionUtil.getShardsFailure; +import static org.opensearch.timeseries.util.RestHandlerUtils.createXContentParserFromRegistry; import java.io.IOException; import java.time.Duration; @@ -31,19 +30,14 @@ import org.opensearch.action.index.IndexResponse; import org.opensearch.action.support.WriteRequest; import org.opensearch.ad.ExecuteADResultResponseRecorder; -import org.opensearch.ad.indices.AnomalyDetectionIndices; -import org.opensearch.ad.model.ADTaskState; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.task.ADTaskManager; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.ad.transport.AnomalyResultAction; import org.opensearch.ad.transport.AnomalyResultRequest; import org.opensearch.ad.transport.StopDetectorAction; import org.opensearch.ad.transport.StopDetectorRequest; import org.opensearch.ad.transport.StopDetectorResponse; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.common.unit.TimeValue; import org.opensearch.common.xcontent.XContentFactory; @@ -53,6 +47,13 @@ import org.opensearch.core.xcontent.XContentParser; import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule; import org.opensearch.jobscheduler.spi.schedule.Schedule; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import 
org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.transport.JobResponse; +import org.opensearch.timeseries.util.RestHandlerUtils; import org.opensearch.transport.TransportService; import com.google.common.base.Throwables; @@ -62,7 +63,7 @@ */ public class IndexAnomalyDetectorJobActionHandler { - private final AnomalyDetectionIndices anomalyDetectionIndices; + private final ADIndexManagement anomalyDetectionIndices; private final String detectorId; private final Long seqNo; private final Long primaryTerm; @@ -91,7 +92,7 @@ public class IndexAnomalyDetectorJobActionHandler { */ public IndexAnomalyDetectorJobActionHandler( Client client, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, String detectorId, Long seqNo, Long primaryTerm, @@ -120,16 +121,16 @@ public IndexAnomalyDetectorJobActionHandler( * @param detector anomaly detector * @param listener Listener to send responses */ - public void startAnomalyDetectorJob(AnomalyDetector detector, ActionListener listener) { + public void startAnomalyDetectorJob(AnomalyDetector detector, ActionListener listener) { // this start listener is created & injected throughout the job handler so that whenever the job response is received, // there's the extra step of trying to index results and update detector state with a 60s delay. - ActionListener startListener = ActionListener.wrap(r -> { + ActionListener startListener = ActionListener.wrap(r -> { try { Instant executionEndTime = Instant.now(); - IntervalTimeConfiguration schedule = (IntervalTimeConfiguration) detector.getDetectionInterval(); + IntervalTimeConfiguration schedule = (IntervalTimeConfiguration) detector.getInterval(); Instant executionStartTime = executionEndTime.minus(schedule.getInterval(), schedule.getUnit()); AnomalyResultRequest getRequest = new AnomalyResultRequest( - detector.getDetectorId(), + detector.getId(), executionStartTime.toEpochMilli(), executionEndTime.toEpochMilli() ); @@ -160,17 +161,17 @@ public void startAnomalyDetectorJob(AnomalyDetector detector, ActionListener { + if (!anomalyDetectionIndices.doesJobIndexExist()) { + anomalyDetectionIndices.initJobIndex(ActionListener.wrap(response -> { if (response.isAcknowledged()) { - logger.info("Created {} with mappings.", ANOMALY_DETECTORS_INDEX); + logger.info("Created {} with mappings.", CommonName.CONFIG_INDEX); createJob(detector, startListener); } else { - logger.warn("Created {} with mappings call not acknowledged.", ANOMALY_DETECTORS_INDEX); + logger.warn("Created {} with mappings call not acknowledged.", CommonName.CONFIG_INDEX); startListener .onFailure( new OpenSearchStatusException( - "Created " + ANOMALY_DETECTORS_INDEX + " with mappings call not acknowledged.", + "Created " + CommonName.CONFIG_INDEX + " with mappings call not acknowledged.", RestStatus.INTERNAL_SERVER_ERROR ) ); @@ -181,14 +182,14 @@ public void startAnomalyDetectorJob(AnomalyDetector detector, ActionListener listener) { + private void createJob(AnomalyDetector detector, ActionListener listener) { try { - IntervalTimeConfiguration interval = (IntervalTimeConfiguration) detector.getDetectionInterval(); + IntervalTimeConfiguration interval = (IntervalTimeConfiguration) detector.getInterval(); Schedule schedule = new IntervalSchedule(Instant.now(), (int) interval.getInterval(), interval.getUnit()); Duration duration = Duration.of(interval.getInterval(), 
interval.getUnit()); - AnomalyDetectorJob job = new AnomalyDetectorJob( - detector.getDetectorId(), + Job job = new Job( + detector.getId(), schedule, detector.getWindowDelay(), true, @@ -197,10 +198,10 @@ private void createJob(AnomalyDetector detector, ActionListener listener - ) { - GetRequest getRequest = new GetRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX).id(detectorId); + private void getJobForWrite(AnomalyDetector detector, Job job, ActionListener listener) { + GetRequest getRequest = new GetRequest(CommonName.JOB_INDEX).id(detectorId); client .get( @@ -229,19 +226,19 @@ private void getAnomalyDetectorJobForWrite( private void onGetAnomalyDetectorJobForWrite( GetResponse response, AnomalyDetector detector, - AnomalyDetectorJob job, - ActionListener listener + Job job, + ActionListener listener ) throws IOException { if (response.isExists()) { try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef())) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob currentAdJob = AnomalyDetectorJob.parse(parser); + Job currentAdJob = Job.parse(parser); if (currentAdJob.isEnabled()) { listener .onFailure(new OpenSearchStatusException("Anomaly detector job is already running: " + detectorId, RestStatus.OK)); return; } else { - AnomalyDetectorJob newJob = new AnomalyDetectorJob( + Job newJob = new Job( job.getName(), job.getSchedule(), job.getWindowDelay(), @@ -251,7 +248,7 @@ private void onGetAnomalyDetectorJobForWrite( Instant.now(), job.getLockDurationSeconds(), job.getUser(), - job.getResultIndex() + job.getCustomResultIndex() ); // Get latest realtime task and check its state before index job. Will reset running realtime task // as STOPPED first if job disabled, then start new job and create new realtime task. 
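For readers unfamiliar with the job-scheduler types used in createJob above, the schedule and lock-duration math reduces to a few lines. A runnable sketch that mirrors the construction in the hunk (only the jobscheduler SPI is assumed on the classpath; the 10-minute interval is illustrative):

    import java.time.Duration;
    import java.time.Instant;
    import java.time.temporal.ChronoUnit;

    import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule;
    import org.opensearch.jobscheduler.spi.schedule.Schedule;

    public class JobScheduleSketch {
        public static void main(String[] args) {
            // An IntervalTimeConfiguration stores a numeric interval plus a ChronoUnit.
            long interval = 10;
            ChronoUnit unit = ChronoUnit.MINUTES;

            // Same construction as createJob: fire every detection interval, starting now.
            Schedule schedule = new IntervalSchedule(Instant.now(), (int) interval, unit);

            // The job's lock duration spans one interval, presumably so two runs of
            // the same detector cannot overlap.
            Duration lockDuration = Duration.of(interval, unit);
            System.out.println("lock duration: " + lockDuration.getSeconds() + "s");
        }
    }
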
@@ -274,12 +271,8 @@ private void onGetAnomalyDetectorJobForWrite( } } - private void indexAnomalyDetectorJob( - AnomalyDetectorJob job, - AnomalyDetectorFunction function, - ActionListener listener - ) throws IOException { - IndexRequest indexRequest = new IndexRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) + private void indexAnomalyDetectorJob(Job job, ExecutorFunction function, ActionListener listener) throws IOException { + IndexRequest indexRequest = new IndexRequest(CommonName.JOB_INDEX) .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE) .source(job.toXContent(XContentFactory.jsonBuilder(), RestHandlerUtils.XCONTENT_WITH_TYPE)) .setIfSeqNo(seqNo) @@ -299,8 +292,8 @@ private void indexAnomalyDetectorJob( private void onIndexAnomalyDetectorJobResponse( IndexResponse response, - AnomalyDetectorFunction function, - ActionListener listener + ExecutorFunction function, + ActionListener listener ) { if (response == null || (response.getResult() != CREATED && response.getResult() != UPDATED)) { String errorMsg = getShardsFailure(response); @@ -310,13 +303,7 @@ private void onIndexAnomalyDetectorJobResponse( if (function != null) { function.execute(); } else { - AnomalyDetectorJobResponse anomalyDetectorJobResponse = new AnomalyDetectorJobResponse( - response.getId(), - response.getVersion(), - response.getSeqNo(), - response.getPrimaryTerm(), - RestStatus.OK - ); + JobResponse anomalyDetectorJobResponse = new JobResponse(response.getId()); listener.onResponse(anomalyDetectorJobResponse); } } @@ -329,18 +316,18 @@ private void onIndexAnomalyDetectorJobResponse( * @param detectorId detector identifier * @param listener Listener to send responses */ - public void stopAnomalyDetectorJob(String detectorId, ActionListener listener) { - GetRequest getRequest = new GetRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX).id(detectorId); + public void stopAnomalyDetectorJob(String detectorId, ActionListener listener) { + GetRequest getRequest = new GetRequest(CommonName.JOB_INDEX).id(detectorId); client.get(getRequest, ActionListener.wrap(response -> { if (response.isExists()) { try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef())) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); + Job job = Job.parse(parser); if (!job.isEnabled()) { - adTaskManager.stopLatestRealtimeTask(detectorId, ADTaskState.STOPPED, null, transportService, listener); + adTaskManager.stopLatestRealtimeTask(detectorId, TaskState.STOPPED, null, transportService, listener); } else { - AnomalyDetectorJob newJob = new AnomalyDetectorJob( + Job newJob = new Job( job.getName(), job.getSchedule(), job.getWindowDelay(), @@ -350,7 +337,7 @@ public void stopAnomalyDetectorJob(String detectorId, ActionListener listener.onFailure(exception))); } - private ActionListener stopAdDetectorListener( - String detectorId, - ActionListener listener - ) { + private ActionListener stopAdDetectorListener(String detectorId, ActionListener listener) { return new ActionListener() { @Override public void onResponse(StopDetectorResponse stopDetectorResponse) { @@ -385,14 +369,14 @@ public void onResponse(StopDetectorResponse stopDetectorResponse) { logger.info("AD model deleted successfully for detector {}", detectorId); // StopDetectorTransportAction will send out DeleteModelAction which will clear all realtime cache. 
// Pass null transport service to method "stopLatestRealtimeTask" to not re-clear coordinating node cache. - adTaskManager.stopLatestRealtimeTask(detectorId, ADTaskState.STOPPED, null, null, listener); + adTaskManager.stopLatestRealtimeTask(detectorId, TaskState.STOPPED, null, null, listener); } else { logger.error("Failed to delete AD model for detector {}", detectorId); // If failed to clear all realtime cache, will try to re-clear coordinating node cache. adTaskManager .stopLatestRealtimeTask( detectorId, - ADTaskState.FAILED, + TaskState.FAILED, new OpenSearchStatusException("Failed to delete AD model", RestStatus.INTERNAL_SERVER_ERROR), transportService, listener @@ -407,7 +391,7 @@ public void onFailure(Exception e) { adTaskManager .stopLatestRealtimeTask( detectorId, - ADTaskState.FAILED, + TaskState.FAILED, new OpenSearchStatusException("Failed to execute stop detector action", RestStatus.INTERNAL_SERVER_ERROR), transportService, listener diff --git a/src/main/java/org/opensearch/ad/rest/handler/ModelValidationActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/ModelValidationActionHandler.java index 4cd77942d..f37a10580 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/ModelValidationActionHandler.java +++ b/src/main/java/org/opensearch/ad/rest/handler/ModelValidationActionHandler.java @@ -33,22 +33,9 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.common.exception.ADValidationException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.feature.SearchFeatureDao; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectorValidationIssueType; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.model.MergeableList; -import org.opensearch.ad.model.TimeConfiguration; -import org.opensearch.ad.model.ValidationAspect; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.transport.ValidateAnomalyDetectorResponse; -import org.opensearch.ad.util.MultiResponsesDelegateActionListener; -import org.opensearch.ad.util.ParseUtils; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; @@ -76,6 +63,20 @@ import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.sort.FieldSortBuilder; import org.opensearch.search.sort.SortOrder; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.MergeableList; +import org.opensearch.timeseries.model.TimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; +import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener; +import org.opensearch.timeseries.util.ParseUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; /** *

This class executes all validation checks that are not blocking on the 'model' level. @@ -157,7 +158,7 @@ public void checkIfMultiEntityDetector() { listener.onFailure(exception); logger.error("Failed to get top entity for categorical field", exception); }); - if (anomalyDetector.isMultientityDetector()) { + if (anomalyDetector.isHighCardinality()) { getTopEntity(recommendationListener); } else { recommendationListener.onResponse(Collections.emptyMap()); @@ -170,7 +171,7 @@ public void checkIfMultiEntityDetector() { // with the highest doc count. private void getTopEntity(ActionListener> topEntityListener) { // Look at data back to the lower bound given the max interval we recommend or one given - long maxIntervalInMinutes = Math.max(MAX_INTERVAL_REC_LENGTH_IN_MINUTES, anomalyDetector.getDetectorIntervalInMinutes()); + long maxIntervalInMinutes = Math.max(MAX_INTERVAL_REC_LENGTH_IN_MINUTES, anomalyDetector.getIntervalInMinutes()); LongBounds timeRangeBounds = getTimeRangeBounds( Instant.now().toEpochMilli(), new IntervalTimeConfiguration(maxIntervalInMinutes, ChronoUnit.MINUTES) @@ -180,17 +181,17 @@ private void getTopEntity(ActionListener> topEntityListener) .to(timeRangeBounds.getMax()); AggregationBuilder bucketAggs; Map topKeys = new HashMap<>(); - if (anomalyDetector.getCategoryField().size() == 1) { + if (anomalyDetector.getCategoryFields().size() == 1) { bucketAggs = AggregationBuilders .terms(AGG_NAME_TOP) - .field(anomalyDetector.getCategoryField().get(0)) + .field(anomalyDetector.getCategoryFields().get(0)) .order(BucketOrder.count(true)); } else { bucketAggs = AggregationBuilders .composite( AGG_NAME_TOP, anomalyDetector - .getCategoryField() + .getCategoryFields() .stream() .map(f -> new TermsValuesSourceBuilder(f).field(f)) .collect(Collectors.toList()) @@ -216,7 +217,7 @@ private void getTopEntity(ActionListener> topEntityListener) topEntityListener.onResponse(Collections.emptyMap()); return; } - if (anomalyDetector.getCategoryField().size() == 1) { + if (anomalyDetector.getCategoryFields().size() == 1) { Terms entities = aggs.get(AGG_NAME_TOP); Object key = entities .getBuckets() @@ -224,7 +225,7 @@ private void getTopEntity(ActionListener> topEntityListener) .max(Comparator.comparingInt(entry -> (int) entry.getDocCount())) .map(MultiBucketsAggregation.Bucket::getKeyAsString) .orElse(null); - topKeys.put(anomalyDetector.getCategoryField().get(0), key); + topKeys.put(anomalyDetector.getCategoryFields().get(0), key); } else { CompositeAggregation compositeAgg = aggs.get(AGG_NAME_TOP); topKeys @@ -252,6 +253,7 @@ private void getTopEntity(ActionListener> topEntityListener) client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -274,9 +276,9 @@ private void getSampleRangesForValidationChecks( if (!latestTime.isPresent() || latestTime.get() <= 0) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.TIME_FIELD_NOT_ENOUGH_HISTORICAL_DATA, - DetectorValidationIssueType.TIMEFIELD_FIELD, + new ValidationException( + CommonMessages.TIME_FIELD_NOT_ENOUGH_HISTORICAL_DATA, + ValidationIssueType.TIMEFIELD_FIELD, ValidationAspect.MODEL ) ); @@ -286,7 +288,7 @@ private void getSampleRangesForValidationChecks( try { getBucketAggregates(timeRangeEnd, listener, topEntity); } catch (IOException e) { - listener.onFailure(new EndRunException(detector.getDetectorId(), CommonErrorMessages.INVALID_SEARCH_QUERY_MSG, e, true)); + listener.onFailure(new EndRunException(detector.getId(), CommonMessages.INVALID_SEARCH_QUERY_MSG, e, true)); } } @@ -295,18 
+297,15 @@ private void getBucketAggregates( ActionListener listener, Map topEntity ) throws IOException { - AggregationBuilder aggregation = getBucketAggregation( - latestTime, - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval() - ); + AggregationBuilder aggregation = getBucketAggregation(latestTime, (IntervalTimeConfiguration) anomalyDetector.getInterval()); BoolQueryBuilder query = QueryBuilders.boolQuery().filter(anomalyDetector.getFilterQuery()); - if (anomalyDetector.isMultientityDetector()) { + if (anomalyDetector.isHighCardinality()) { if (topEntity.isEmpty()) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.CATEGORY_FIELD_TOO_SPARSE, - DetectorValidationIssueType.CATEGORY, + new ValidationException( + CommonMessages.CATEGORY_FIELD_TOO_SPARSE, + ValidationIssueType.CATEGORY, ValidationAspect.MODEL ) ); @@ -332,7 +331,7 @@ private void getBucketAggregates( new ModelValidationActionHandler.DetectorIntervalRecommendationListener( intervalListener, searchRequest.source(), - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval(), + (IntervalTimeConfiguration) anomalyDetector.getInterval(), clock.millis() + TOP_VALIDATE_TIMEOUT_IN_MILLIS, latestTime, false, @@ -346,6 +345,7 @@ private void getBucketAggregates( client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -420,13 +420,13 @@ public void onResponse(SearchResponse response) { } else if (expirationEpochMs < clock.millis()) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.TIMEOUT_ON_INTERVAL_REC, - DetectorValidationIssueType.TIMEOUT, + new ValidationException( + CommonMessages.TIMEOUT_ON_INTERVAL_REC, + ValidationIssueType.TIMEOUT, ValidationAspect.MODEL ) ); - logger.info(CommonErrorMessages.TIMEOUT_ON_INTERVAL_REC); + logger.info(CommonMessages.TIMEOUT_ON_INTERVAL_REC); // keep trying higher intervals as new interval is below max, and we aren't decreasing yet } else if (newIntervalMinute < MAX_INTERVAL_REC_LENGTH_IN_MINUTES && !decreasingInterval) { searchWithDifferentInterval(newIntervalMinute); @@ -434,7 +434,7 @@ public void onResponse(SearchResponse response) { // we aren't decreasing yet, at this point we will start decreasing for the first time // if we are inside the below block } else if (newIntervalMinute >= MAX_INTERVAL_REC_LENGTH_IN_MINUTES && !decreasingInterval) { - IntervalTimeConfiguration givenInterval = (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval(); + IntervalTimeConfiguration givenInterval = (IntervalTimeConfiguration) anomalyDetector.getInterval(); this.detectorInterval = new IntervalTimeConfiguration( (long) Math .floor( @@ -463,6 +463,7 @@ public void onResponse(SearchResponse response) { client::search, user, client, + AnalysisType.AD, this ); // In this case decreasingInterval has to be true already, so we will stop @@ -497,6 +498,7 @@ private void searchWithDifferentInterval(long newIntervalMinuteValue) { client::search, user, client, + AnalysisType.AD, this ); } @@ -506,9 +508,9 @@ public void onFailure(Exception e) { logger.error("Failed to recommend new interval", e); listener .onFailure( - new ADValidationException( - CommonErrorMessages.MODEL_VALIDATION_FAILED_UNEXPECTEDLY, - DetectorValidationIssueType.AGGREGATION, + new ValidationException( + CommonMessages.MODEL_VALIDATION_FAILED_UNEXPECTEDLY, + ValidationIssueType.AGGREGATION, ValidationAspect.MODEL ) ); @@ -522,7 +524,7 @@ private void processIntervalRecommendation(IntervalTimeConfiguration interval, l if (interval == null) 
{ checkRawDataSparsity(latestTime); } else { - if (interval.equals(anomalyDetector.getDetectionInterval())) { + if (interval.equals(anomalyDetector.getInterval())) { logger.info("Using the current interval there is enough dense data "); // Check if there is a window delay recommendation if everything else is successful and send exception if (Instant.now().toEpochMilli() - latestTime > timeConfigToMilliSec(anomalyDetector.getWindowDelay())) { @@ -536,9 +538,9 @@ private void processIntervalRecommendation(IntervalTimeConfiguration interval, l // return response with interval recommendation listener .onFailure( - new ADValidationException( - CommonErrorMessages.DETECTOR_INTERVAL_REC + interval.getInterval(), - DetectorValidationIssueType.DETECTION_INTERVAL, + new ValidationException( + CommonMessages.INTERVAL_REC + interval.getInterval(), + ValidationIssueType.DETECTION_INTERVAL, ValidationAspect.MODEL, interval ) @@ -560,10 +562,7 @@ private SearchSourceBuilder getSearchSourceBuilder(QueryBuilder query, Aggregati } private void checkRawDataSparsity(long latestTime) { - AggregationBuilder aggregation = getBucketAggregation( - latestTime, - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval() - ); + AggregationBuilder aggregation = getBucketAggregation(latestTime, (IntervalTimeConfiguration) anomalyDetector.getInterval()); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().aggregation(aggregation).size(0).timeout(requestTimeout); SearchRequest searchRequest = new SearchRequest(anomalyDetector.getIndices().toArray(new String[0])).source(searchSourceBuilder); final ActionListener searchResponseListener = ActionListener @@ -576,6 +575,7 @@ private void checkRawDataSparsity(long latestTime) { client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -589,9 +589,9 @@ private Histogram checkBucketResultErrors(SearchResponse response) { logger.warn("Unexpected null aggregation."); listener .onFailure( - new ADValidationException( - CommonErrorMessages.MODEL_VALIDATION_FAILED_UNEXPECTEDLY, - DetectorValidationIssueType.AGGREGATION, + new ValidationException( + CommonMessages.MODEL_VALIDATION_FAILED_UNEXPECTEDLY, + ValidationIssueType.AGGREGATION, ValidationAspect.MODEL ) ); @@ -614,11 +614,7 @@ private void processRawDataResults(SearchResponse response, long latestTime) { if (fullBucketRate < INTERVAL_BUCKET_MINIMUM_SUCCESS_RATE) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.RAW_DATA_TOO_SPARSE, - DetectorValidationIssueType.INDICES, - ValidationAspect.MODEL - ) + new ValidationException(CommonMessages.RAW_DATA_TOO_SPARSE, ValidationIssueType.INDICES, ValidationAspect.MODEL) ); } else { checkDataFilterSparsity(latestTime); @@ -626,10 +622,7 @@ private void processRawDataResults(SearchResponse response, long latestTime) { } private void checkDataFilterSparsity(long latestTime) { - AggregationBuilder aggregation = getBucketAggregation( - latestTime, - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval() - ); + AggregationBuilder aggregation = getBucketAggregation(latestTime, (IntervalTimeConfiguration) anomalyDetector.getInterval()); BoolQueryBuilder query = QueryBuilders.boolQuery().filter(anomalyDetector.getFilterQuery()); SearchSourceBuilder searchSourceBuilder = getSearchSourceBuilder(query, aggregation); SearchRequest searchRequest = new SearchRequest(anomalyDetector.getIndices().toArray(new String[0])).source(searchSourceBuilder); @@ -643,6 +636,7 @@ private void checkDataFilterSparsity(long latestTime) 
{ client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -656,16 +650,16 @@ private void processDataFilterResults(SearchResponse response, long latestTime) if (fullBucketRate < CONFIG_BUCKET_MINIMUM_SUCCESS_RATE) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.FILTER_QUERY_TOO_SPARSE, - DetectorValidationIssueType.FILTER_QUERY, + new ValidationException( + CommonMessages.FILTER_QUERY_TOO_SPARSE, + ValidationIssueType.FILTER_QUERY, ValidationAspect.MODEL ) ); // blocks below are executed if data is dense enough with filter query applied. // If HCAD then category fields will be added to bucket aggregation to see if they // are the root cause of the issues and if not the feature queries will be checked for sparsity - } else if (anomalyDetector.isMultientityDetector()) { + } else if (anomalyDetector.isHighCardinality()) { getTopEntityForCategoryField(latestTime); } else { try { @@ -692,10 +686,7 @@ private void checkCategoryFieldSparsity(Map topEntity, long late for (Map.Entry entry : topEntity.entrySet()) { query.filter(QueryBuilders.termQuery(entry.getKey(), entry.getValue())); } - AggregationBuilder aggregation = getBucketAggregation( - latestTime, - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval() - ); + AggregationBuilder aggregation = getBucketAggregation(latestTime, (IntervalTimeConfiguration) anomalyDetector.getInterval()); SearchSourceBuilder searchSourceBuilder = getSearchSourceBuilder(query, aggregation); SearchRequest searchRequest = new SearchRequest(anomalyDetector.getIndices().toArray(new String[0])).source(searchSourceBuilder); final ActionListener searchResponseListener = ActionListener @@ -708,6 +699,7 @@ private void checkCategoryFieldSparsity(Map topEntity, long late client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -721,11 +713,7 @@ private void processTopEntityResults(SearchResponse response, long latestTime) { if (fullBucketRate < CONFIG_BUCKET_MINIMUM_SUCCESS_RATE) { listener .onFailure( - new ADValidationException( - CommonErrorMessages.CATEGORY_FIELD_TOO_SPARSE, - DetectorValidationIssueType.CATEGORY, - ValidationAspect.MODEL - ) + new ValidationException(CommonMessages.CATEGORY_FIELD_TOO_SPARSE, ValidationIssueType.CATEGORY, ValidationAspect.MODEL) ); } else { try { @@ -742,27 +730,18 @@ private void checkFeatureQueryDelegate(long latestTime) throws IOException { windowDelayRecommendation(latestTime); }, exception -> { listener - .onFailure( - new ADValidationException( - exception.getMessage(), - DetectorValidationIssueType.FEATURE_ATTRIBUTES, - ValidationAspect.MODEL - ) - ); + .onFailure(new ValidationException(exception.getMessage(), ValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.MODEL)); }); MultiResponsesDelegateActionListener> multiFeatureQueriesResponseListener = new MultiResponsesDelegateActionListener<>( validateFeatureQueriesListener, anomalyDetector.getFeatureAttributes().size(), - CommonErrorMessages.FEATURE_QUERY_TOO_SPARSE, + CommonMessages.FEATURE_QUERY_TOO_SPARSE, false ); for (Feature feature : anomalyDetector.getFeatureAttributes()) { - AggregationBuilder aggregation = getBucketAggregation( - latestTime, - (IntervalTimeConfiguration) anomalyDetector.getDetectionInterval() - ); + AggregationBuilder aggregation = getBucketAggregation(latestTime, (IntervalTimeConfiguration) anomalyDetector.getInterval()); BoolQueryBuilder query = QueryBuilders.boolQuery().filter(anomalyDetector.getFilterQuery()); List featureFields = 
ParseUtils.getFieldNamesForFeature(feature, xContentRegistry); for (String featureField : featureFields) { @@ -780,9 +759,9 @@ private void checkFeatureQueryDelegate(long latestTime) throws IOException { if (fullBucketRate < CONFIG_BUCKET_MINIMUM_SUCCESS_RATE) { multiFeatureQueriesResponseListener .onFailure( - new ADValidationException( - CommonErrorMessages.FEATURE_QUERY_TOO_SPARSE, - DetectorValidationIssueType.FEATURE_ATTRIBUTES, + new ValidationException( + CommonMessages.FEATURE_QUERY_TOO_SPARSE, + ValidationIssueType.FEATURE_ATTRIBUTES, ValidationAspect.MODEL ) ); @@ -793,7 +772,7 @@ private void checkFeatureQueryDelegate(long latestTime) throws IOException { }, e -> { logger.error(e); multiFeatureQueriesResponseListener - .onFailure(new OpenSearchStatusException(CommonErrorMessages.FEATURE_QUERY_TOO_SPARSE, RestStatus.BAD_REQUEST, e)); + .onFailure(new OpenSearchStatusException(CommonMessages.FEATURE_QUERY_TOO_SPARSE, RestStatus.BAD_REQUEST, e)); }); // using the original context in listener as user roles have no permissions for internal operations like fetching a // checkpoint @@ -803,6 +782,7 @@ private void checkFeatureQueryDelegate(long latestTime) throws IOException { client::search, user, client, + AnalysisType.AD, searchResponseListener ); } @@ -812,9 +792,9 @@ private void sendWindowDelayRec(long latestTimeInMillis) { long minutesSinceLastStamp = (long) Math.ceil((Instant.now().toEpochMilli() - latestTimeInMillis) / 60000.0); listener .onFailure( - new ADValidationException( - String.format(Locale.ROOT, CommonErrorMessages.WINDOW_DELAY_REC, minutesSinceLastStamp, minutesSinceLastStamp), - DetectorValidationIssueType.WINDOW_DELAY, + new ValidationException( + String.format(Locale.ROOT, CommonMessages.WINDOW_DELAY_REC, minutesSinceLastStamp, minutesSinceLastStamp), + ValidationIssueType.WINDOW_DELAY, ValidationAspect.MODEL, new IntervalTimeConfiguration(minutesSinceLastStamp, ChronoUnit.MINUTES) ) @@ -836,23 +816,17 @@ private void windowDelayRecommendation(long latestTime) { // a time was always above 0.25 meaning the best suggestion is to simply ingest more data or change interval since // we have no more insight regarding the root cause of the lower density. 
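All of the sparsity checks in this handler (raw data, filter query, category field, per-feature) boil down to one number: the fraction of requested history buckets that actually contain documents, compared against a minimum success rate such as the 0.25 referenced in the comment above. A minimal sketch of that computation (names are illustrative stand-ins for the accounting in checkBucketResultErrors and the process*Results methods):

    import java.util.Arrays;

    public class FullBucketRateSketch {
        // bucketDocCounts holds the doc count of each date_histogram bucket that
        // came back; numberOfSamples is how many interval buckets were requested.
        static double fullBucketRate(long[] bucketDocCounts, int numberOfSamples) {
            long nonEmpty = Arrays.stream(bucketDocCounts).filter(c -> c > 0).count();
            return (double) nonEmpty / numberOfSamples;
        }

        public static void main(String[] args) {
            // 3 non-empty buckets out of 8 requested intervals: rate 0.375.
            // Below the minimum success rate, the handler reports a *_TOO_SPARSE issue.
            System.out.println(fullBucketRate(new long[] { 4, 0, 2, 0, 0, 1, 0, 0 }, 8));
        }
    }
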
listener - .onFailure( - new ADValidationException( - CommonErrorMessages.RAW_DATA_TOO_SPARSE, - DetectorValidationIssueType.INDICES, - ValidationAspect.MODEL - ) - ); + .onFailure(new ValidationException(CommonMessages.RAW_DATA_TOO_SPARSE, ValidationIssueType.INDICES, ValidationAspect.MODEL)); } private LongBounds getTimeRangeBounds(long endMillis, IntervalTimeConfiguration detectorIntervalInMinutes) { Long detectorInterval = timeConfigToMilliSec(detectorIntervalInMinutes); - Long startMillis = endMillis - ((long) getNumberOfSamples() * detectorInterval); + Long startMillis = endMillis - (getNumberOfSamples() * detectorInterval); return new LongBounds(startMillis, endMillis); } private int getNumberOfSamples() { - long interval = anomalyDetector.getDetectorIntervalInMilliseconds(); + long interval = anomalyDetector.getIntervalInMilliseconds(); return Math .max( (int) (Duration.ofHours(AnomalyDetectorSettings.TRAIN_SAMPLE_TIME_RANGE_IN_HOURS).toMillis() / interval), diff --git a/src/main/java/org/opensearch/ad/rest/handler/ValidateAnomalyDetectorActionHandler.java b/src/main/java/org/opensearch/ad/rest/handler/ValidateAnomalyDetectorActionHandler.java index d3c4c5e64..3c0b13c5e 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/ValidateAnomalyDetectorActionHandler.java +++ b/src/main/java/org/opensearch/ad/rest/handler/ValidateAnomalyDetectorActionHandler.java @@ -13,11 +13,9 @@ import java.time.Clock; -import org.opensearch.ad.feature.SearchFeatureDao; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.transport.ValidateAnomalyDetectorResponse; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; @@ -26,6 +24,8 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.rest.RestRequest; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.util.SecurityClientUtil; /** * Anomaly detector REST action handler to process POST request. 
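A side note on the getTimeRangeBounds change above: dropping the (long) cast is safe because getNumberOfSamples() returns an int and detectorInterval is a Long, so after unboxing the multiplication already happens in long arithmetic. A worked example with illustrative values (the hunk cuts off the second argument to Math.max, so the 24-hour training range and the 256-sample floor below are stand-ins, not the plugin's actual constants):

    import java.time.Duration;

    public class SampleWindowSketch {
        public static void main(String[] args) {
            long intervalMillis = Duration.ofMinutes(10).toMillis(); // 10-minute detector interval
            // max(training-range / interval, minimum sample count), as in getNumberOfSamples().
            int numberOfSamples = Math.max((int) (Duration.ofHours(24).toMillis() / intervalMillis), 256);
            long endMillis = System.currentTimeMillis();
            // int * long promotes to long, so no explicit cast is needed.
            long startMillis = endMillis - (numberOfSamples * intervalMillis);
            System.out.println(numberOfSamples + " samples cover "
                + Duration.ofMillis(endMillis - startMillis).toHours() + "h of history");
        }
    }
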
@@ -59,7 +59,7 @@ public ValidateAnomalyDetectorActionHandler( Client client, SecurityClientUtil clientUtil, ActionListener listener, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, AnomalyDetector anomalyDetector, TimeValue requestTimeout, Integer maxSingleEntityAnomalyDetectors, diff --git a/src/main/java/org/opensearch/ad/settings/EnabledSetting.java b/src/main/java/org/opensearch/ad/settings/ADEnabledSetting.java similarity index 68% rename from src/main/java/org/opensearch/ad/settings/EnabledSetting.java rename to src/main/java/org/opensearch/ad/settings/ADEnabledSetting.java index 797df2a2b..ed4414f6c 100644 --- a/src/main/java/org/opensearch/ad/settings/EnabledSetting.java +++ b/src/main/java/org/opensearch/ad/settings/ADEnabledSetting.java @@ -20,37 +20,37 @@ import java.util.Map; import org.opensearch.common.settings.Setting; +import org.opensearch.timeseries.settings.DynamicNumericSetting; -public class EnabledSetting extends AbstractSetting { +public class ADEnabledSetting extends DynamicNumericSetting { /** * Singleton instance */ - private static EnabledSetting INSTANCE; + private static ADEnabledSetting INSTANCE; /** * Settings name */ - public static final String AD_PLUGIN_ENABLED = "plugins.anomaly_detection.enabled"; + public static final String AD_ENABLED = "plugins.anomaly_detection.enabled"; public static final String AD_BREAKER_ENABLED = "plugins.anomaly_detection.breaker.enabled"; - public static final String LEGACY_OPENDISTRO_AD_PLUGIN_ENABLED = "opendistro.anomaly_detection.enabled"; + public static final String LEGACY_OPENDISTRO_AD_ENABLED = "opendistro.anomaly_detection.enabled"; public static final String LEGACY_OPENDISTRO_AD_BREAKER_ENABLED = "opendistro.anomaly_detection.breaker.enabled"; public static final String INTERPOLATION_IN_HCAD_COLD_START_ENABLED = "plugins.anomaly_detection.hcad_cold_start_interpolation.enabled"; - public static final String DOOR_KEEPER_IN_CACHE_ENABLED = "plugins.anomaly_detection.door_keeper_in_cache.enabled";; + public static final String DOOR_KEEPER_IN_CACHE_ENABLED = "plugins.anomaly_detection.door_keeper_in_cache.enabled"; public static final Map> settings = unmodifiableMap(new HashMap>() { { - Setting LegacyADPluginEnabledSetting = Setting - .boolSetting(LEGACY_OPENDISTRO_AD_PLUGIN_ENABLED, true, NodeScope, Dynamic, Deprecated); + Setting LegacyADEnabledSetting = Setting.boolSetting(LEGACY_OPENDISTRO_AD_ENABLED, true, NodeScope, Dynamic, Deprecated); /** - * Legacy OpenDistro AD plugin enable/disable setting + * Legacy OpenDistro AD enable/disable setting */ - put(LEGACY_OPENDISTRO_AD_PLUGIN_ENABLED, LegacyADPluginEnabledSetting); + put(LEGACY_OPENDISTRO_AD_ENABLED, LegacyADEnabledSetting); Setting LegacyADBreakerEnabledSetting = Setting .boolSetting(LEGACY_OPENDISTRO_AD_BREAKER_ENABLED, true, NodeScope, Dynamic, Deprecated); @@ -60,9 +60,9 @@ public class EnabledSetting extends AbstractSetting { put(LEGACY_OPENDISTRO_AD_BREAKER_ENABLED, LegacyADBreakerEnabledSetting); /** - * AD plugin enable/disable setting + * AD enable/disable setting */ - put(AD_PLUGIN_ENABLED, Setting.boolSetting(AD_PLUGIN_ENABLED, LegacyADPluginEnabledSetting, NodeScope, Dynamic)); + put(AD_ENABLED, Setting.boolSetting(AD_ENABLED, LegacyADEnabledSetting, NodeScope, Dynamic)); /** * AD breaker enable/disable setting @@ -86,23 +86,23 @@ public class EnabledSetting extends AbstractSetting { } }); - private EnabledSetting(Map> settings) { + ADEnabledSetting(Map> settings) { super(settings); } - public static 
synchronized EnabledSetting getInstance() { + public static synchronized ADEnabledSetting getInstance() { if (INSTANCE == null) { - INSTANCE = new EnabledSetting(settings); + INSTANCE = new ADEnabledSetting(settings); } return INSTANCE; } /** - * Whether AD plugin is enabled. If disabled, AD plugin rejects RESTful requests and stop all AD jobs. - * @return whether AD plugin is enabled. + * Whether AD is enabled. If disabled, the time series plugin rejects RESTful requests on AD and stops all AD jobs. + * @return whether AD is enabled. */ - public static boolean isADPluginEnabled() { - return EnabledSetting.getInstance().getSettingValue(EnabledSetting.AD_PLUGIN_ENABLED); + public static boolean isADEnabled() { + return ADEnabledSetting.getInstance().getSettingValue(ADEnabledSetting.AD_ENABLED); } /** @@ -110,7 +110,7 @@ public static boolean isADPluginEnabled() { * @return whether AD circuit breaker is enabled or not. */ public static boolean isADBreakerEnabled() { - return EnabledSetting.getInstance().getSettingValue(EnabledSetting.AD_BREAKER_ENABLED); + return ADEnabledSetting.getInstance().getSettingValue(ADEnabledSetting.AD_BREAKER_ENABLED); } /** @@ -118,7 +118,7 @@ public static boolean isADBreakerEnabled() { * @return whether interpolation in HCAD cold start is enabled or not. */ public static boolean isInterpolationInColdStartEnabled() { - return EnabledSetting.getInstance().getSettingValue(EnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED); + return ADEnabledSetting.getInstance().getSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED); } /** @@ -126,6 +126,6 @@ public static boolean isInterpolationInColdStartEnabled() { * @return whether door keeper in cache is enabled or not. */ public static boolean isDoorKeeperInCacheEnabled() { - return EnabledSetting.getInstance().getSettingValue(EnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED); + return ADEnabledSetting.getInstance().getSettingValue(ADEnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED); } } diff --git a/src/main/java/org/opensearch/ad/settings/NumericSetting.java b/src/main/java/org/opensearch/ad/settings/ADNumericSetting.java similarity index 75% rename from src/main/java/org/opensearch/ad/settings/NumericSetting.java rename to src/main/java/org/opensearch/ad/settings/ADNumericSetting.java index eed8ac7ec..e064867a0 100644 --- a/src/main/java/org/opensearch/ad/settings/NumericSetting.java +++ b/src/main/java/org/opensearch/ad/settings/ADNumericSetting.java @@ -17,13 +17,14 @@ import java.util.Map; import org.opensearch.common.settings.Setting; +import org.opensearch.timeseries.settings.DynamicNumericSetting; -public class NumericSetting extends AbstractSetting { +public class ADNumericSetting extends DynamicNumericSetting { /** * Singleton instance */ - private static NumericSetting INSTANCE; + private static ADNumericSetting INSTANCE; /** * Settings name @@ -45,22 +46,21 @@ public class NumericSetting extends AbstractSetting { } }); - private NumericSetting(Map> settings) { + ADNumericSetting(Map> settings) { super(settings); } - public static synchronized NumericSetting getInstance() { + public static synchronized ADNumericSetting getInstance() { if (INSTANCE == null) { - INSTANCE = new NumericSetting(settings); + INSTANCE = new ADNumericSetting(settings); } return INSTANCE; } /** - * Whether AD plugin is enabled. If disabled, AD plugin rejects RESTful requests and stop all AD jobs. - * @return whether AD plugin is enabled.
+ * @return the max number of categorical fields */ public static int maxCategoricalFields() { - return NumericSetting.getInstance().getSettingValue(NumericSetting.CATEGORY_FIELD_LIMIT); + return ADNumericSetting.getInstance().getSettingValue(ADNumericSetting.CATEGORY_FIELD_LIMIT); } } diff --git a/src/main/java/org/opensearch/ad/settings/AnomalyDetectorSettings.java b/src/main/java/org/opensearch/ad/settings/AnomalyDetectorSettings.java index 75f5219e6..b5f10b383 100644 --- a/src/main/java/org/opensearch/ad/settings/AnomalyDetectorSettings.java +++ b/src/main/java/org/opensearch/ad/settings/AnomalyDetectorSettings.java @@ -11,10 +11,9 @@ package org.opensearch.ad.settings; -import java.time.Duration; - import org.opensearch.common.settings.Setting; import org.opensearch.common.unit.TimeValue; +import org.opensearch.timeseries.settings.TimeSeriesSettings; /** * AD plugin settings. @@ -24,7 +23,7 @@ public final class AnomalyDetectorSettings { private AnomalyDetectorSettings() {} public static final int MAX_DETECTOR_UPPER_LIMIT = 10000; - public static final Setting MAX_SINGLE_ENTITY_ANOMALY_DETECTORS = Setting + public static final Setting AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS = Setting .intSetting( "plugins.anomaly_detection.max_anomaly_detectors", LegacyOpenDistroAnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, @@ -34,7 +33,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting MAX_MULTI_ENTITY_ANOMALY_DETECTORS = Setting + public static final Setting AD_MAX_HC_ANOMALY_DETECTORS = Setting .intSetting( "plugins.anomaly_detection.max_multi_entity_anomaly_detectors", LegacyOpenDistroAnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS, @@ -54,7 +53,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting REQUEST_TIMEOUT = Setting + public static final Setting AD_REQUEST_TIMEOUT = Setting .positiveTimeSetting( "plugins.anomaly_detection.request_timeout", LegacyOpenDistroAnomalyDetectorSettings.REQUEST_TIMEOUT, @@ -113,7 +112,13 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting MAX_RETRY_FOR_UNRESPONSIVE_NODE = Setting + /** + * @deprecated This setting is deprecated because we need to manage fault tolerance for + * multiple analyses such as AD and forecasting. + * Use TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE instead. + */ + @Deprecated + public static final Setting AD_MAX_RETRY_FOR_UNRESPONSIVE_NODE = Setting .intSetting( "plugins.anomaly_detection.max_retry_for_unresponsive_node", LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, @@ -122,7 +127,13 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting COOLDOWN_MINUTES = Setting + /** + * @deprecated This setting is deprecated because we need to manage fault tolerance for + * multiple analyses such as AD and forecasting. + * Use TimeSeriesSettings.COOLDOWN_MINUTES instead. + */ + @Deprecated + public static final Setting AD_COOLDOWN_MINUTES = Setting .positiveTimeSetting( "plugins.anomaly_detection.cooldown_minutes", LegacyOpenDistroAnomalyDetectorSettings.COOLDOWN_MINUTES, @@ -130,7 +141,13 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting BACKOFF_MINUTES = Setting + /** + * @deprecated This setting is deprecated because we need to manage fault tolerance for + * multiple analyses such as AD and forecasting. + * Use TimeSeriesSettings.BACKOFF_MINUTES instead.
+ */ + @Deprecated + public static final Setting AD_BACKOFF_MINUTES = Setting .positiveTimeSetting( "plugins.anomaly_detection.backoff_minutes", LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_MINUTES, @@ -138,7 +155,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting BACKOFF_INITIAL_DELAY = Setting + public static final Setting AD_BACKOFF_INITIAL_DELAY = Setting .positiveTimeSetting( "plugins.anomaly_detection.backoff_initial_delay", LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_INITIAL_DELAY, @@ -146,7 +163,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting MAX_RETRY_FOR_BACKOFF = Setting + public static final Setting AD_MAX_RETRY_FOR_BACKOFF = Setting .intSetting( "plugins.anomaly_detection.max_retry_for_backoff", LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF, @@ -155,7 +172,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting MAX_RETRY_FOR_END_RUN_EXCEPTION = Setting + public static final Setting AD_MAX_RETRY_FOR_END_RUN_EXCEPTION = Setting .intSetting( "plugins.anomaly_detection.max_retry_for_end_run_exception", LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_END_RUN_EXCEPTION, @@ -164,28 +181,24 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting FILTER_BY_BACKEND_ROLES = Setting + public static final Setting AD_FILTER_BY_BACKEND_ROLES = Setting .boolSetting( "plugins.anomaly_detection.filter_by_backend_roles", - LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, + LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, Setting.Property.NodeScope, Setting.Property.Dynamic ); - public static final String ANOMALY_DETECTORS_INDEX_MAPPING_FILE = "mappings/anomaly-detectors.json"; - public static final String ANOMALY_DETECTOR_JOBS_INDEX_MAPPING_FILE = "mappings/anomaly-detector-jobs.json"; public static final String ANOMALY_RESULTS_INDEX_MAPPING_FILE = "mappings/anomaly-results.json"; public static final String ANOMALY_DETECTION_STATE_INDEX_MAPPING_FILE = "mappings/anomaly-detection-state.json"; - public static final String CHECKPOINT_INDEX_MAPPING_FILE = "mappings/checkpoint.json"; - - public static final Duration HOURLY_MAINTENANCE = Duration.ofHours(1); + public static final String CHECKPOINT_INDEX_MAPPING_FILE = "mappings/anomaly-checkpoint.json"; // saving checkpoint every 12 hours. // To support 1 million entities in 36 data nodes, each node has roughly 28K models. // In each hour, we roughly need to save 2400 models. Since each model saving can - // take about 1 seconds (default value of AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_SECS) + // take about 1 second (default value of AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS) // we can use up to 2400 seconds to finish saving checkpoints.
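A back-of-the-envelope check of the checkpoint-saving budget in the comment above, restating only figures the comment itself gives (1 million entities across 36 data nodes, a 12-hour saving frequency, roughly one second per save); the divisions come out slightly below the comment's rounded numbers:

    public class CheckpointBudgetSketch {
        public static void main(String[] args) {
            int modelsPerNode = 1_000_000 / 36;        // ~27,777 models per node ("roughly 28K")
            int savesPerHour = modelsPerNode / 12;     // ~2,314 saves per hour ("roughly 2400")
            int workSecondsPerHour = savesPerHour * 1; // ~1 s per save, per the comment
            // Well under the 3,600 s in an hour, so the 12-hour AD_CHECKPOINT_SAVING_FREQ
            // default leaves ample headroom for checkpoint writes.
            System.out.println(modelsPerNode + " models/node, " + savesPerHour
                + " saves/hour, ~" + workSecondsPerHour + " s of save work per hour");
        }
    }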
- public static final Setting CHECKPOINT_SAVING_FREQ = Setting + public static final Setting AD_CHECKPOINT_SAVING_FREQ = Setting .positiveTimeSetting( "plugins.anomaly_detection.checkpoint_saving_freq", TimeValue.timeValueHours(12), @@ -193,7 +206,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting CHECKPOINT_TTL = Setting + public static final Setting AD_CHECKPOINT_TTL = Setting .positiveTimeSetting( "plugins.anomaly_detection.checkpoint_ttl", TimeValue.timeValueDays(7), @@ -204,62 +217,16 @@ private AnomalyDetectorSettings() {} // ====================================== // ML parameters // ====================================== - // RCF - public static final int NUM_SAMPLES_PER_TREE = 256; - - public static final int NUM_TREES = 30; - - public static final int TRAINING_SAMPLE_INTERVAL = 64; - - public static final double TIME_DECAY = 0.0001; - - // If we have 32 + shingleSize (hopefully recent) values, RCF can get up and running. It will be noisy — - // there is a reason that default size is 256 (+ shingle size), but it may be more useful for people to - /// start seeing some results. - public static final int NUM_MIN_SAMPLES = 32; - - // The threshold for splitting RCF models in single-stream detectors. - // The smallest machine in the Amazon managed service has 1GB heap. - // With the setting, the desired model size there is of 2 MB. - // By default, we can have at most 5 features. Since the default shingle size - // is 8, we have at most 40 dimensions in RCF. In our current RCF setting, - // 30 trees, and bounding box cache ratio 0, 40 dimensions use 449KB. - // Users can increase the number of features to 10 and shingle size to 60, - // 30 trees, bounding box cache ratio 0, 600 dimensions use 1.8 MB. - // Since these sizes are smaller than the threshold 2 MB, we won't split models - // even in the smallest machine. 
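The block removed above encodes the single-stream model-splitting heuristic in concrete numbers; a sketch restating its arithmetic (the 449 KB and 1.8 MB figures are the comment's own measurements, not re-derived here):

    public class ModelSizeSketch {
        public static void main(String[] args) {
            // RCF dimensions = number of features * shingle size.
            int defaultDims = 5 * 8;  // 40 dimensions -> ~449 KB (30 trees, bounding box cache ratio 0)
            int largeDims = 10 * 60;  // 600 dimensions -> ~1.8 MB under the same configuration
            // Desired model size on the smallest (1 GB heap) managed-service machine:
            double thresholdMb = 1024 * 0.002; // DESIRED_MODEL_SIZE_PERCENTAGE of 1 GB ~= 2 MB
            // Both measured sizes sit below ~2 MB, so models are not split even on that machine.
            System.out.println(defaultDims + " dims / " + largeDims + " dims, split threshold ~"
                + thresholdMb + " MB");
        }
    }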
- public static final double DESIRED_MODEL_SIZE_PERCENTAGE = 0.002; - - public static final Setting MODEL_MAX_SIZE_PERCENTAGE = Setting + public static final Setting AD_MODEL_MAX_SIZE_PERCENTAGE = Setting .doubleSetting( "plugins.anomaly_detection.model_max_size_percent", LegacyOpenDistroAnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE, 0, - 0.7, + 0.9, Setting.Property.NodeScope, Setting.Property.Dynamic ); - // for a batch operation, we want all of the bounding box in-place for speed - public static final double BATCH_BOUNDING_BOX_CACHE_RATIO = 1; - - // for a real-time operation, we trade off speed for memory as real time opearation - // only has to do one update/scoring per interval - public static final double REAL_TIME_BOUNDING_BOX_CACHE_RATIO = 0; - - public static final int DEFAULT_SHINGLE_SIZE = 8; - - // max shingle size we have seen from external users - // the larger shingle size, the harder to fill in a complete shingle - public static final int MAX_SHINGLE_SIZE = 60; - - // Thresholding - public static final double THRESHOLD_MIN_PVALUE = 0.995; - - public static final double THRESHOLD_MAX_RANK_ERROR = 0.0001; - - public static final double THRESHOLD_MAX_SCORE = 8; - public static final int THRESHOLD_NUM_LOGNORMAL_QUANTILES = 400; public static final int THRESHOLD_DOWNSAMPLES = 5_000; @@ -280,9 +247,6 @@ private AnomalyDetectorSettings() {} // shingling public static final double MAX_SHINGLE_PROPORTION_MISSING = 0.25; - // AD JOB - public static final long DEFAULT_AD_JOB_LOC_DURATION_SECONDS = 60; - // Thread pool public static final int AD_THEAD_POOL_QUEUE_SIZE = 1000; @@ -304,7 +268,7 @@ private AnomalyDetectorSettings() {} * Other detectors cannot use space reserved by a detector's dedicated cache. * DEDICATED_CACHE_SIZE is a setting to make dedicated cache's size flexible. * When that setting is changed, if the size decreases, we will release memory - * if required (e.g., when a user also decreased AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE, + * if required (e.g., when a user also decreased AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE, * the max memory percentage that AD can use); * if the size increases, we may reject the setting change if we cannot fulfill * that request (e.g., when it will use more memory than allowed for AD). @@ -316,42 +280,19 @@ private AnomalyDetectorSettings() {} * where 3.2 GB is from 10% memory limit of AD plugin. * That's why I am using 60_000 as the max limit. */ - public static final Setting DEDICATED_CACHE_SIZE = Setting + public static final Setting AD_DEDICATED_CACHE_SIZE = Setting .intSetting("plugins.anomaly_detection.dedicated_cache_size", 10, 0, 60_000, Setting.Property.NodeScope, Setting.Property.Dynamic); // We only keep priority (4 bytes float) in inactive cache. 1 million priorities // take up 4 MB. public static final int MAX_INACTIVE_ENTITIES = 1_000_000; - // 1 million insertion costs roughly 1 MB. - public static final int DOOR_KEEPER_FOR_CACHE_MAX_INSERTION = 1_000_000; - - // 100,000 insertions costs roughly 1KB.
- public static final int DOOR_KEEPER_FOR_COLD_STARTER_MAX_INSERTION = 100_000; - - public static final double DOOR_KEEPER_FAULSE_POSITIVE_RATE = 0.01; - - // clean up door keeper every 60 intervals - public static final int DOOR_KEEPER_MAINTENANCE_FREQ = 60; - - // Increase the value will adding pressure to indexing anomaly results and our feature query - // OpenSearch-only setting as previous the legacy default is too low (1000) - public static final Setting MAX_ENTITIES_PER_QUERY = Setting - .intSetting( - "plugins.anomaly_detection.max_entities_per_query", - 1_000_000, - 0, - 2_000_000, - Setting.Property.NodeScope, - Setting.Property.Dynamic - ); - // save partial zero-anomaly grade results after indexing pressure reaches the limit // Opendistro version has similar setting. I lowered the value to make room // for INDEX_PRESSURE_HARD_LIMIT. I didn't find a floatSetting that has both default // and fallback values. I want users to use the new default value 0.6 instead of 0.8. // So I do not plan to use the value of the legacy setting as fallback. - public static final Setting INDEX_PRESSURE_SOFT_LIMIT = Setting + public static final Setting AD_INDEX_PRESSURE_SOFT_LIMIT = Setting .floatSetting( "plugins.anomaly_detection.index_pressure_soft_limit", 0.6f, @@ -363,7 +304,7 @@ private AnomalyDetectorSettings() {} // save only error or larger-than-one anomaly grade results after indexing // pressure reaches the limit // opensearch-only setting - public static final Setting INDEX_PRESSURE_HARD_LIMIT = Setting + public static final Setting AD_INDEX_PRESSURE_HARD_LIMIT = Setting .floatSetting( "plugins.anomaly_detection.index_pressure_hard_limit", 0.9f, @@ -373,7 +314,7 @@ private AnomalyDetectorSettings() {} ); // max number of primary shards of an AD index - public static final Setting MAX_PRIMARY_SHARDS = Setting + public static final Setting AD_MAX_PRIMARY_SHARDS = Setting .intSetting( "plugins.anomaly_detection.max_primary_shards", LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS, @@ -383,12 +324,6 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - // max entity value's length - public static int MAX_ENTITY_LENGTH = 256; - - // number of bulk checkpoints per second - public static double CHECKPOINT_BULK_PER_SECOND = 0.02; - // ====================================== // Historical analysis // ====================================== @@ -404,6 +339,8 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); + // Use TimeSeriesSettings.MAX_CACHED_DELETED_TASKS for both AD and forecasting + @Deprecated // Maximum number of deleted tasks that can be kept in cache.
public static final Setting MAX_CACHED_DELETED_TASKS = Setting .intSetting( @@ -430,13 +367,12 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final int MAX_BATCH_TASK_PIECE_SIZE = 10_000; public static final Setting BATCH_TASK_PIECE_SIZE = Setting .intSetting( "plugins.anomaly_detection.batch_task_piece_size", LegacyOpenDistroAnomalyDetectorSettings.BATCH_TASK_PIECE_SIZE, 1, - MAX_BATCH_TASK_PIECE_SIZE, + TimeSeriesSettings.MAX_BATCH_TASK_PIECE_SIZE, Setting.Property.NodeScope, Setting.Property.Dynamic ); @@ -478,7 +414,7 @@ private AnomalyDetectorSettings() {} // ====================================== // the percentage of heap usage allowed for queues holding small requests // set it to 0 to disable the queue - public static final Setting COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.cold_entity_queue_max_heap_percent", 0.001f, @@ -487,7 +423,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.checkpoint_read_queue_max_heap_percent", 0.001f, @@ -496,7 +432,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.entity_cold_start_queue_max_heap_percent", 0.001f, @@ -507,7 +443,7 @@ private AnomalyDetectorSettings() {} // the percentage of heap usage allowed for queues holding large requests // set it to 0 to disable the queue - public static final Setting CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.checkpoint_write_queue_max_heap_percent", 0.01f, @@ -516,7 +452,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.result_write_queue_max_heap_percent", 0.01f, @@ -525,7 +461,7 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Setting CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT = Setting + public static final Setting AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT = Setting .floatSetting( "plugins.anomaly_detection.checkpoint_maintain_queue_max_heap_percent", 0.001f, @@ -537,7 +473,7 @@ private AnomalyDetectorSettings() {} // expected execution time per cold entity request. This setting controls // the speed of cold entity requests execution. The larger, the faster, and // the more performance impact to customers' workload. - public static final Setting EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS = Setting + public static final Setting AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS = Setting .intSetting( "plugins.anomaly_detection.expected_cold_entity_execution_time_in_millisecs", 3000, @@ -550,7 +486,7 @@ private AnomalyDetectorSettings() {} // expected execution time per checkpoint maintain request. This setting controls // the speed of checkpoint maintenance execution. 
The larger, the faster, and // the more performance impact to customers' workload. - public static final Setting EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS = Setting + public static final Setting AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS = Setting .intSetting( "plugins.anomaly_detection.expected_checkpoint_maintain_time_in_millisecs", 1000, @@ -560,73 +496,10 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - /** - * EntityRequest has entityName (# category fields * 256, the recommended limit - * of a keyword field length), model Id (roughly 256 bytes), and QueuedRequest - * fields including detector Id(roughly 128 bytes), expirationEpochMs (long, - * 8 bytes), and priority (12 bytes). - * Plus Java object size (12 bytes), we have roughly 928 bytes per request - * assuming we have 2 categorical fields (plan to support 2 categorical fields now). - * We don't want the total size exceeds 0.1% of the heap. - * We can have at most 0.1% heap / 928 = heap / 928,000. - * For t3.small, 0.1% heap is of 1MB. The queue's size is up to - * 10^ 6 / 928 = 1078 - */ - public static int ENTITY_REQUEST_SIZE_IN_BYTES = 928; - - /** - * EntityFeatureRequest consists of EntityRequest (928 bytes, read comments - * of ENTITY_COLD_START_QUEUE_SIZE_CONSTANT), pointer to current feature - * (8 bytes), and dataStartTimeMillis (8 bytes). We have roughly - * 928 + 16 = 944 bytes per request. - * - * We don't want the total size exceeds 0.1% of the heap. - * We should have at most 0.1% heap / 944 = heap / 944,000 - * For t3.small, 0.1% heap is of 1MB. The queue's size is up to - * 10^ 6 / 944 = 1059 - */ - public static int ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES = 944; - - /** - * ResultWriteRequest consists of index request (roughly 1KB), and QueuedRequest - * fields (148 bytes, read comments of ENTITY_REQUEST_SIZE_CONSTANT). - * Plus Java object size (12 bytes), we have roughly 1160 bytes per request - * - * We don't want the total size exceeds 1% of the heap. - * We should have at most 1% heap / 1148 = heap / 116,000 - * For t3.small, 1% heap is of 10MB. The queue's size is up to - * 10^ 7 / 1160 = 8621 - */ - public static int RESULT_WRITE_QUEUE_SIZE_IN_BYTES = 1160; - - /** - * CheckpointWriteRequest consists of IndexRequest (200 KB), and QueuedRequest - * fields (148 bytes, read comments of ENTITY_REQUEST_SIZE_CONSTANT). - * The total is roughly 200 KB per request. - * - * We don't want the total size exceeds 1% of the heap. - * We should have at most 1% heap / 200KB = heap / 20,000,000 - * For t3.small, 1% heap is of 10MB. The queue's size is up to - * 10^ 7 / 2.0 * 10^5 = 50 - */ - public static int CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES = 200_000; - - /** - * CheckpointMaintainRequest has model Id (roughly 256 bytes), and QueuedRequest - * fields including detector Id(roughly 128 bytes), expirationEpochMs (long, - * 8 bytes), and priority (12 bytes). - * Plus Java object size (12 bytes), we have roughly 416 bytes per request. - * We don't want the total size exceeds 0.1% of the heap. - * We can have at most 0.1% heap / 416 = heap / 416,000. - * For t3.small, 0.1% heap is of 1MB. 
The queue's size is up to - * 10^ 6 / 416 = 2403 - */ - public static int CHECKPOINT_MAINTAIN_REQUEST_SIZE_IN_BYTES = 416; - /** * Max concurrent entity cold starts per node */ - public static final Setting ENTITY_COLD_START_QUEUE_CONCURRENCY = Setting + public static final Setting AD_ENTITY_COLD_START_QUEUE_CONCURRENCY = Setting .intSetting( "plugins.anomaly_detection.entity_cold_start_queue_concurrency", 1, @@ -639,7 +512,7 @@ private AnomalyDetectorSettings() {} /** * Max concurrent checkpoint reads per node */ - public static final Setting CHECKPOINT_READ_QUEUE_CONCURRENCY = Setting + public static final Setting AD_CHECKPOINT_READ_QUEUE_CONCURRENCY = Setting .intSetting( "plugins.anomaly_detection.checkpoint_read_queue_concurrency", 1, @@ -652,7 +525,7 @@ private AnomalyDetectorSettings() {} /** * Max concurrent checkpoint writes per node */ - public static final Setting CHECKPOINT_WRITE_QUEUE_CONCURRENCY = Setting + public static final Setting AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY = Setting .intSetting( "plugins.anomaly_detection.checkpoint_write_queue_concurrency", 2, @@ -666,7 +539,7 @@ private AnomalyDetectorSettings() {} * Max concurrent result writes per node. Since checkpoint is relatively large * (250KB), we have 2 concurrent threads processing the queue. */ - public static final Setting RESULT_WRITE_QUEUE_CONCURRENCY = Setting + public static final Setting AD_RESULT_WRITE_QUEUE_CONCURRENCY = Setting .intSetting( "plugins.anomaly_detection.result_write_queue_concurrency", 2, @@ -679,7 +552,7 @@ private AnomalyDetectorSettings() {} /** * Assume each checkpoint takes roughly 200KB. 25 requests are of 5 MB. */ - public static final Setting CHECKPOINT_READ_QUEUE_BATCH_SIZE = Setting + public static final Setting AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE = Setting .intSetting( "plugins.anomaly_detection.checkpoint_read_queue_batch_size", 25, @@ -694,7 +567,7 @@ private AnomalyDetectorSettings() {} * ref: https://tinyurl.com/3zdbmbwy * Assume each checkpoint takes roughly 200KB. 25 requests are of 5 MB. */ - public static final Setting CHECKPOINT_WRITE_QUEUE_BATCH_SIZE = Setting + public static final Setting AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE = Setting .intSetting( "plugins.anomaly_detection.checkpoint_write_queue_batch_size", 25, @@ -709,7 +582,7 @@ private AnomalyDetectorSettings() {} * ref: https://tinyurl.com/3zdbmbwy * Assume each result takes roughly 1KB. 5000 requests are of 5 MB. */ - public static final Setting RESULT_WRITE_QUEUE_BATCH_SIZE = Setting + public static final Setting AD_RESULT_WRITE_QUEUE_BATCH_SIZE = Setting .intSetting( "plugins.anomaly_detection.result_write_queue_batch_size", 5000, @@ -719,48 +592,55 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - public static final Duration QUEUE_MAINTENANCE = Duration.ofMinutes(10); - - public static final float MAX_QUEUED_TASKS_RATIO = 0.5f; - - public static final float MEDIUM_SEGMENT_PRUNE_RATIO = 0.1f; - - public static final float LOW_SEGMENT_PRUNE_RATIO = 0.3f; - - // expensive maintenance (e.g., queue maintenance) with 1/10000 probability - public static final int MAINTENANCE_FREQ_CONSTANT = 10000; + /** + * EntityRequest has entityName (# category fields * 256, the recommended limit + * of a keyword field length), model Id (roughly 256 bytes), and QueuedRequest + * fields including detector Id(roughly 128 bytes), expirationEpochMs (long, + * 8 bytes), and priority (12 bytes). 
+ * Plus Java object size (12 bytes), we have roughly 928 bytes per request + * assuming we have 2 categorical fields (plan to support 2 categorical fields now). + * We don't want the total size to exceed 0.1% of the heap. + * We can have at most 0.1% heap / 928 = heap / 928,000. + * For t3.small, 0.1% of the heap is 1MB. The queue's size is up to + * 10^6 / 928 = 1078 + */ + // to be replaced by TimeSeriesSettings.FEATURE_REQUEST_SIZE_IN_BYTES + @Deprecated + public static int ENTITY_REQUEST_SIZE_IN_BYTES = 928; - // ====================================== - // Checkpoint setting - // ====================================== - // we won't accept a checkpoint larger than 30MB. Or we risk OOM. - // For reference, in RCF 1.0, the checkpoint of a RCF with 50 trees, 10 dimensions, - // 256 samples is of 3.2MB. - // In compact rcf, the same RCF is of 163KB. - // Since we allow at most 5 features, and the default shingle size is 8 and default - // tree number size is 100, we can have at most 25.6 MB in RCF 1.0. - // It is possible that cx increases the max features or shingle size, but we don't want - // to risk OOM for the flexibility. - public static final int MAX_CHECKPOINT_BYTES = 30_000_000; - - // Sets the cap on the number of buffer that can be allocated by the rcf deserialization - // buffer pool. Each buffer is of 512 bytes. Memory occupied by 20 buffers is 10.24 KB. - public static final int MAX_TOTAL_RCF_SERIALIZATION_BUFFERS = 20; - - // the size of the buffer used for rcf deserialization - public static final int SERIALIZATION_BUFFER_BYTES = 512; + /** + * EntityFeatureRequest consists of EntityRequest (928 bytes, read comments + * of ENTITY_COLD_START_QUEUE_SIZE_CONSTANT), pointer to current feature + * (8 bytes), and dataStartTimeMillis (8 bytes). We have roughly + * 928 + 16 = 944 bytes per request. + * + * We don't want the total size to exceed 0.1% of the heap. + * We should have at most 0.1% heap / 944 = heap / 944,000 + * For t3.small, 0.1% of the heap is 1MB. The queue's size is up to + * 10^6 / 944 = 1059 + */ + // to be replaced by TimeSeriesSettings.FEATURE_REQUEST_SIZE_IN_BYTES + @Deprecated + public static int ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES = 944; // ====================================== // pagination setting // ====================================== // pagination size - public static final Setting PAGE_SIZE = Setting + public static final Setting AD_PAGE_SIZE = Setting .intSetting("plugins.anomaly_detection.page_size", 1_000, 0, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); - // within an interval, how many percents are used to process requests. - // 1.0 means we use all of the detection interval to process requests. - // to ensure we don't block next interval, it is better to set it less than 1.0. - public static final float INTERVAL_RATIO_FOR_REQUESTS = 0.9f; + // Increasing the value will add pressure to indexing anomaly results and our feature query. + // OpenSearch-only setting, as the previous legacy default was too low (1000) + public static final Setting AD_MAX_ENTITIES_PER_QUERY = Setting + .intSetting( + "plugins.anomaly_detection.max_entities_per_query", + 1_000_000, + 0, + 2_000_000, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); // ====================================== // preview setting
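The deprecated per-request size constants above all feed one capacity rule: queue capacity = (heap size * heap percentage) / bytes per request. A sketch recomputing the bounds quoted in those comments; the 1 GB heap is an assumption chosen to match the comments' "0.1% of the heap is 1MB" figure for t3.small:

    public class QueueCapacitySketch {
        static long capacity(long heapBytes, double heapFraction, int bytesPerRequest) {
            return (long) (heapBytes * heapFraction) / bytesPerRequest;
        }

        public static void main(String[] args) {
            long heapBytes = 1_000_000_000L; // assumed ~1 GB heap; 0.1% of it is ~1 MB
            System.out.println(capacity(heapBytes, 0.001, 928)); // ~1,077 entity requests (comment rounds to 1078)
            System.out.println(capacity(heapBytes, 0.001, 944)); // ~1,059 entity feature requests
        }
    }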
// the setting is used to limit resource usage due to showing models - public static final Setting MAX_MODEL_SIZE_PER_NODE = Setting + public static final Setting AD_MAX_MODEL_SIZE_PER_NODE = Setting .intSetting( "plugins.anomaly_detection.max_model_size_per_node", 100, @@ -821,25 +701,6 @@ private AnomalyDetectorSettings() {} Setting.Property.Dynamic ); - // profile API needs to report total entities. We can use cardinality aggregation for a single-category field. - // But we cannot do that for multi-category fields as it requires scripting to generate run time fields, - // which is expensive. We work around the problem by using a composite query to find the first 10_000 buckets. - // Generally, traversing all buckets/combinations can't be done without visiting all matches, which is costly - // for data with many entities. Given that it is often enough to have a lower bound of the number of entities, - // such as "there are at least 10000 entities", the default is set to 10,000. That is, requests will count the - // total entities up to 10,000. - public static final int MAX_TOTAL_ENTITIES_TO_TRACK = 10_000; - - // ====================================== - // AD Index setting - // ====================================== - public static int MAX_UPDATE_RETRY_TIMES = 10_000; - - // ====================================== - // Cold start setting - // ====================================== - public static int MAX_COLD_START_ROUNDS = 2; - // ====================================== // Validate Detector API setting // ====================================== diff --git a/src/main/java/org/opensearch/ad/settings/LegacyOpenDistroAnomalyDetectorSettings.java b/src/main/java/org/opensearch/ad/settings/LegacyOpenDistroAnomalyDetectorSettings.java index d8ca4b777..f552e1f5d 100644 --- a/src/main/java/org/opensearch/ad/settings/LegacyOpenDistroAnomalyDetectorSettings.java +++ b/src/main/java/org/opensearch/ad/settings/LegacyOpenDistroAnomalyDetectorSettings.java @@ -170,7 +170,7 @@ private LegacyOpenDistroAnomalyDetectorSettings() {} Setting.Property.Deprecated ); - public static final Setting FILTER_BY_BACKEND_ROLES = Setting + public static final Setting AD_FILTER_BY_BACKEND_ROLES = Setting .boolSetting( "opendistro.anomaly_detection.filter_by_backend_roles", false, diff --git a/src/main/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplier.java b/src/main/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplier.java index cf89e638c..2cdee5fb8 100644 --- a/src/main/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplier.java +++ b/src/main/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplier.java @@ -14,7 +14,7 @@ import static org.opensearch.ad.ml.ModelState.LAST_CHECKPOINT_TIME_KEY; import static org.opensearch.ad.ml.ModelState.LAST_USED_TIME_KEY; import static org.opensearch.ad.ml.ModelState.MODEL_TYPE_KEY; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE; import java.util.ArrayList; import java.util.Arrays; @@ -27,10 +27,11 @@ import java.util.stream.Stream; import org.opensearch.ad.caching.CacheProvider; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.ml.ModelManager; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; +import org.opensearch.timeseries.constant.CommonName; /** * ModelsOnNodeSupplier provides a List of ModelStates 
info for the models the nodes contains @@ -47,8 +48,8 @@ public class ModelsOnNodeSupplier implements Supplier>> public static Set MODEL_STATE_STAT_KEYS = new HashSet<>( Arrays .asList( - CommonName.MODEL_ID_KEY, - CommonName.DETECTOR_ID_KEY, + CommonName.MODEL_ID_FIELD, + ADCommonName.DETECTOR_ID_KEY, MODEL_TYPE_KEY, CommonName.ENTITY_KEY, LAST_USED_TIME_KEY, @@ -67,8 +68,8 @@ public class ModelsOnNodeSupplier implements Supplier>> public ModelsOnNodeSupplier(ModelManager modelManager, CacheProvider cache, Settings settings, ClusterService clusterService) { this.modelManager = modelManager; this.cache = cache; - this.numModelsToReturn = MAX_MODEL_SIZE_PER_NODE.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_MODEL_SIZE_PER_NODE, it -> this.numModelsToReturn = it); + this.numModelsToReturn = AD_MAX_MODEL_SIZE_PER_NODE.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MAX_MODEL_SIZE_PER_NODE, it -> this.numModelsToReturn = it); } @Override diff --git a/src/main/java/org/opensearch/ad/task/ADBatchTaskCache.java b/src/main/java/org/opensearch/ad/task/ADBatchTaskCache.java index 646433693..05897fe64 100644 --- a/src/main/java/org/opensearch/ad/task/ADBatchTaskCache.java +++ b/src/main/java/org/opensearch/ad/task/ADBatchTaskCache.java @@ -11,10 +11,10 @@ package org.opensearch.ad.task; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_MIN_SAMPLES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_TREES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.TIME_DECAY; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_MIN_SAMPLES; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_SAMPLES_PER_TREE; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_TREES; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.TIME_DECAY; import java.util.ArrayDeque; import java.util.Deque; @@ -26,8 +26,8 @@ import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.settings.AnomalyDetectorSettings; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.amazon.randomcutforest.config.Precision; import com.amazon.randomcutforest.config.TransformMethod; @@ -56,9 +56,9 @@ public class ADBatchTaskCache { private Entity entity; protected ADBatchTaskCache(ADTask adTask) { - this.detectorId = adTask.getDetectorId(); + this.detectorId = adTask.getConfigId(); this.taskId = adTask.getTaskId(); - this.detectorTaskId = adTask.getDetectorLevelTaskId(); + this.detectorTaskId = adTask.getConfigLevelTaskId(); this.entity = adTask.getEntity(); AnomalyDetector detector = adTask.getDetector(); @@ -78,9 +78,9 @@ protected ADBatchTaskCache(ADTask adTask) { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.BATCH_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.BATCH_BOUNDING_BOX_CACHE_RATIO) .shingleSize(shingleSize) - .anomalyRate(1 - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE) + .anomalyRate(1 - TimeSeriesSettings.THRESHOLD_MIN_PVALUE) .transformMethod(TransformMethod.NORMALIZE) .alertOnce(true) .autoAdjust(true) @@ -90,7 +90,7 @@ protected ADBatchTaskCache(ADTask adTask) { 
this.thresholdModelTrained = false; } - protected String getDetectorId() { + protected String getId() { return detectorId; } diff --git a/src/main/java/org/opensearch/ad/task/ADBatchTaskRunner.java b/src/main/java/org/opensearch/ad/task/ADBatchTaskRunner.java index 18844f860..f25b09af4 100644 --- a/src/main/java/org/opensearch/ad/task/ADBatchTaskRunner.java +++ b/src/main/java/org/opensearch/ad/task/ADBatchTaskRunner.java @@ -11,11 +11,7 @@ package org.opensearch.ad.task; -import static org.opensearch.ad.AnomalyDetectorPlugin.AD_BATCH_TASK_THREAD_POOL_NAME; -import static org.opensearch.ad.breaker.MemoryCircuitBreaker.DEFAULT_JVM_HEAP_USAGE_THRESHOLD; -import static org.opensearch.ad.constant.CommonErrorMessages.NO_ELIGIBLE_NODE_TO_RUN_DETECTOR; -import static org.opensearch.ad.constant.CommonName.AGG_NAME_MAX_TIME; -import static org.opensearch.ad.constant.CommonName.AGG_NAME_MIN_TIME; +import static org.opensearch.ad.constant.ADCommonMessages.NO_ELIGIBLE_NODE_TO_RUN_DETECTOR; import static org.opensearch.ad.model.ADTask.CURRENT_PIECE_FIELD; import static org.opensearch.ad.model.ADTask.EXECUTION_END_TIME_FIELD; import static org.opensearch.ad.model.ADTask.INIT_PROGRESS_FIELD; @@ -28,10 +24,12 @@ import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_TOP_ENTITIES_FOR_HISTORICAL_ANALYSIS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_TOP_ENTITIES_LIMIT_FOR_HISTORICAL_ANALYSIS; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_MIN_SAMPLES; import static org.opensearch.ad.stats.InternalStatNames.JVM_HEAP_USAGE; -import static org.opensearch.ad.stats.StatNames.AD_EXECUTING_BATCH_TASK_COUNT; -import static org.opensearch.ad.util.ParseUtils.isNullOrEmpty; +import static org.opensearch.timeseries.TimeSeriesAnalyticsPlugin.AD_BATCH_TASK_THREAD_POOL_NAME; +import static org.opensearch.timeseries.breaker.MemoryCircuitBreaker.DEFAULT_JVM_HEAP_USAGE_THRESHOLD; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_MIN_SAMPLES; +import static org.opensearch.timeseries.stats.StatNames.AD_EXECUTING_BATCH_TASK_COUNT; +import static org.opensearch.timeseries.util.ParseUtils.isNullOrEmpty; import java.time.Clock; import java.time.Instant; @@ -51,35 +49,21 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.support.ThreadedActionListener; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.PriorityTracker; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.common.exception.ADTaskCancelledException; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.feature.FeatureManager; -import org.opensearch.ad.feature.SearchFeatureDao; import org.opensearch.ad.feature.SinglePointFeatures; import org.opensearch.ad.indices.ADIndex; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.model.ADTask; -import 
org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.FeatureData; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; +import org.opensearch.ad.settings.ADEnabledSetting; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.EnabledSetting; import org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; import org.opensearch.ad.transport.ADBatchAnomalyResultRequest; import org.opensearch.ad.transport.ADBatchAnomalyResultResponse; import org.opensearch.ad.transport.ADBatchTaskRemoteExecutionAction; @@ -87,9 +71,6 @@ import org.opensearch.ad.transport.ADStatsNodesAction; import org.opensearch.ad.transport.ADStatsRequest; import org.opensearch.ad.transport.handler.AnomalyResultBulkIndexHandler; -import org.opensearch.ad.util.ExceptionUtil; -import org.opensearch.ad.util.ParseUtils; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; @@ -109,6 +90,25 @@ import org.opensearch.search.aggregations.metrics.InternalMin; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.common.exception.TaskCancelledException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.ParseUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportRequestOptions; import org.opensearch.transport.TransportService; @@ -129,10 +129,10 @@ public class ADBatchTaskRunner { private final ADStats adStats; private final ClusterService clusterService; private final FeatureManager featureManager; - private final ADCircuitBreakerService adCircuitBreakerService; + private final CircuitBreakerService adCircuitBreakerService; private final ADTaskManager adTaskManager; private final AnomalyResultBulkIndexHandler anomalyResultBulkIndexHandler; - private final AnomalyDetectionIndices anomalyDetectionIndices; + private final ADIndexManagement anomalyDetectionIndices; private final SearchFeatureDao searchFeatureDao; private final ADTaskCacheManager adTaskCacheManager; @@ -155,10 +155,10 @@ public ADBatchTaskRunner( ClusterService clusterService, Client client, SecurityClientUtil 
clientUtil, - ADCircuitBreakerService adCircuitBreakerService, + CircuitBreakerService adCircuitBreakerService, FeatureManager featureManager, ADTaskManager adTaskManager, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, ADStats adStats, AnomalyResultBulkIndexHandler anomalyResultBulkIndexHandler, ADTaskCacheManager adTaskCacheManager, @@ -181,7 +181,7 @@ public ADBatchTaskRunner( this.option = TransportRequestOptions .builder() .withType(TransportRequestOptions.Type.REG) - .withTimeout(AnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings)) + .withTimeout(AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(settings)) .build(); this.adTaskCacheManager = adTaskCacheManager; @@ -219,8 +219,8 @@ public ADBatchTaskRunner( * @param listener action listener */ public void run(ADTask adTask, TransportService transportService, ActionListener listener) { - boolean isHCDetector = adTask.getDetector().isMultientityDetector(); - if (isHCDetector && !adTaskCacheManager.topEntityInited(adTask.getDetectorId())) { + boolean isHCDetector = adTask.getDetector().isHighCardinality(); + if (isHCDetector && !adTaskCacheManager.topEntityInited(adTask.getConfigId())) { // Initialize top entities for HC detector threadPool.executor(AD_BATCH_TASK_THREAD_POOL_NAME).execute(() -> { ActionListener hcDelegatedListener = getInternalHCDelegatedListener(adTask); @@ -262,7 +262,7 @@ private ActionListener getTopEntitiesListener( ActionListener listener ) { String taskId = adTask.getTaskId(); - String detectorId = adTask.getDetectorId(); + String detectorId = adTask.getConfigId(); ActionListener actionListener = ActionListener.wrap(response -> { adTaskCacheManager.setTopEntityInited(detectorId); int totalEntities = adTaskCacheManager.getPendingEntityCount(detectorId); @@ -325,11 +325,11 @@ public void getTopEntities(ADTask adTask, ActionListener internalHCListe getDateRangeOfSourceData(adTask, (dataStartTime, dataEndTime) -> { PriorityTracker priorityTracker = new PriorityTracker( Clock.systemUTC(), - adTask.getDetector().getDetectorIntervalInSeconds(), + adTask.getDetector().getIntervalInSeconds(), adTask.getDetectionDateRange().getStartTime().toEpochMilli(), MAX_TOP_ENTITIES_LIMIT_FOR_HISTORICAL_ANALYSIS ); - long detectorInterval = adTask.getDetector().getDetectorIntervalInMilliseconds(); + long detectorInterval = adTask.getDetector().getIntervalInMilliseconds(); logger .debug( "start to search top entities at {}, data start time: {}, data end time: {}, interval: {}", @@ -338,7 +338,7 @@ public void getTopEntities(ADTask adTask, ActionListener internalHCListe dataEndTime, detectorInterval ); - if (adTask.getDetector().isMultiCategoryDetector()) { + if (adTask.getDetector().hasMultipleCategories()) { searchTopEntitiesForMultiCategoryHC( adTask, priorityTracker, @@ -390,19 +390,19 @@ private void searchTopEntitiesForMultiCategoryHC( logger.debug("finish searching top entities at " + System.currentTimeMillis()); List topNEntities = priorityTracker.getTopNEntities(maxTopEntitiesPerHcDetector); if (topNEntities.size() == 0) { - logger.error("There is no entity found for detector " + adTask.getDetectorId()); - internalHCListener.onFailure(new ResourceNotFoundException(adTask.getDetectorId(), "No entity found")); + logger.error("There is no entity found for detector " + adTask.getConfigId()); + internalHCListener.onFailure(new ResourceNotFoundException(adTask.getConfigId(), "No entity found")); return; } - adTaskCacheManager.addPendingEntities(adTask.getDetectorId(), topNEntities); - 
adTaskCacheManager.setTopEntityCount(adTask.getDetectorId(), topNEntities.size()); + adTaskCacheManager.addPendingEntities(adTask.getConfigId(), topNEntities); + adTaskCacheManager.setTopEntityCount(adTask.getConfigId(), topNEntities.size()); internalHCListener.onResponse("Get top entities done"); } }, e -> { - logger.error("Failed to get top entities for detector " + adTask.getDetectorId(), e); + logger.error("Failed to get top entities for detector " + adTask.getConfigId(), e); internalHCListener.onFailure(e); }); - int minimumDocCount = Math.max((int) (bucketInterval / adTask.getDetector().getDetectorIntervalInMilliseconds()) / 2, 1); + int minimumDocCount = Math.max((int) (bucketInterval / adTask.getDetector().getIntervalInMilliseconds()) / 2, 1); searchFeatureDao .getHighestCountEntities( adTask.getDetector(), @@ -437,7 +437,7 @@ private void searchTopEntitiesForSingleCategoryHC( String topEntitiesAgg = "topEntities"; AggregationBuilder aggregation = new TermsAggregationBuilder(topEntitiesAgg) - .field(adTask.getDetector().getCategoryField().get(0)) + .field(adTask.getDetector().getCategoryFields().get(0)) .size(MAX_TOP_ENTITIES_LIMIT_FOR_HISTORICAL_ANALYSIS); sourceBuilder.aggregation(aggregation).size(0); SearchRequest searchRequest = new SearchRequest(); @@ -467,16 +467,16 @@ private void searchTopEntitiesForSingleCategoryHC( logger.debug("finish searching top entities at " + System.currentTimeMillis()); List topNEntities = priorityTracker.getTopNEntities(maxTopEntitiesPerHcDetector); if (topNEntities.size() == 0) { - logger.error("There is no entity found for detector " + adTask.getDetectorId()); - internalHCListener.onFailure(new ResourceNotFoundException(adTask.getDetectorId(), "No entity found")); + logger.error("There is no entity found for detector " + adTask.getConfigId()); + internalHCListener.onFailure(new ResourceNotFoundException(adTask.getConfigId(), "No entity found")); return; } - adTaskCacheManager.addPendingEntities(adTask.getDetectorId(), topNEntities); - adTaskCacheManager.setTopEntityCount(adTask.getDetectorId(), topNEntities.size()); + adTaskCacheManager.addPendingEntities(adTask.getConfigId(), topNEntities); + adTaskCacheManager.setTopEntityCount(adTask.getConfigId(), topNEntities.size()); internalHCListener.onResponse("Get top entities done"); } }, e -> { - logger.error("Failed to get top entities for detector " + adTask.getDetectorId(), e); + logger.error("Failed to get top entities for detector " + adTask.getConfigId(), e); internalHCListener.onFailure(e); }); // using the original context in listener as user roles have no permissions for internal operations like fetching a @@ -488,6 +488,7 @@ private void searchTopEntitiesForSingleCategoryHC( // user is the one who started historical detector. Read AnomalyDetectorJobTransportAction.doExecute. 
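// Note: the AnalysisType.AD argument added below tags this injected-security search for the
// shared org.opensearch.timeseries utilities; the same tag is added to the date-range query in
// getDateRangeOfSourceData, distinguishing AD requests from forecasting ones.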
adTask.getUser(), client, + AnalysisType.AD, searchResponseListener ); } @@ -511,9 +512,9 @@ public void forwardOrExecuteADTask( ) { try { checkIfADTaskCancelledAndCleanupCache(adTask); - String detectorId = adTask.getDetectorId(); + String detectorId = adTask.getConfigId(); AnomalyDetector detector = adTask.getDetector(); - boolean isHCDetector = detector.isMultientityDetector(); + boolean isHCDetector = detector.isHighCardinality(); if (isHCDetector) { String entityString = adTaskCacheManager.pollEntity(detectorId); logger.debug("Start to run entity: {} of detector {}", entityString, detectorId); @@ -560,14 +561,14 @@ public void forwardOrExecuteADTask( logger.info("Create entity task for entity:{}", entityString); Instant now = Instant.now(); ADTask adEntityTask = new ADTask.Builder() - .detectorId(adTask.getDetectorId()) + .configId(adTask.getConfigId()) .detector(detector) .isLatest(true) .taskType(ADTaskType.HISTORICAL_HC_ENTITY.name()) .executionStartTime(now) .taskProgress(0.0f) .initProgress(0.0f) - .state(ADTaskState.INIT.name()) + .state(TaskState.INIT.name()) .initProgress(0.0f) .lastUpdateTime(now) .startedBy(adTask.getStartedBy()) @@ -594,7 +595,7 @@ public void forwardOrExecuteADTask( ); } else { Map updatedFields = new HashMap<>(); - updatedFields.put(STATE_FIELD, ADTaskState.INIT.name()); + updatedFields.put(STATE_FIELD, TaskState.INIT.name()); updatedFields.put(INIT_PROGRESS_FIELD, 0.0f); ActionListener workerNodeResponseListener = workerNodeResponseListener( adTask, @@ -636,7 +637,7 @@ private ActionListener workerNodeResponseListener( if (adTask.isEntityTask()) { // When reach this line, the entity task already been put into worker node's cache. // Then it's safe to move entity from temp entities queue to running entities queue. - adTaskCacheManager.moveToRunningEntity(adTask.getDetectorId(), adTaskManager.convertEntityToString(adTask)); + adTaskCacheManager.moveToRunningEntity(adTask.getConfigId(), adTaskManager.convertEntityToString(adTask)); } startNewEntityTaskLane(adTask, transportService); }, e -> { @@ -644,10 +645,10 @@ private ActionListener workerNodeResponseListener( listener.onFailure(e); handleException(adTask, e); - if (adTask.getDetector().isMultientityDetector()) { + if (adTask.getDetector().isHighCardinality()) { // Entity task done on worker node. Send entity task done message to coordinating node to poll next entity. adTaskManager.entityTaskDone(adTask, e, transportService); - if (adTaskCacheManager.getAvailableNewEntityTaskLanes(adTask.getDetectorId()) > 0) { + if (adTaskCacheManager.getAvailableNewEntityTaskLanes(adTask.getConfigId()) > 0) { // When reach this line, it means entity task failed to start on worker node // Sleep some time before starting new task lane. 
threadPool @@ -696,8 +697,8 @@ private void forwardOrExecuteEntityTask( // start new entity task lane private synchronized void startNewEntityTaskLane(ADTask adTask, TransportService transportService) { - if (adTask.getDetector().isMultientityDetector() && adTaskCacheManager.getAndDecreaseEntityTaskLanes(adTask.getDetectorId()) > 0) { - logger.debug("start new task lane for detector {}", adTask.getDetectorId()); + if (adTask.getDetector().isHighCardinality() && adTaskCacheManager.getAndDecreaseEntityTaskLanes(adTask.getConfigId()) > 0) { + logger.debug("start new task lane for detector {}", adTask.getConfigId()); forwardOrExecuteADTask(adTask, transportService, getInternalHCDelegatedListener(adTask)); } } @@ -719,10 +720,10 @@ private void dispatchTask(ADTask adTask, ActionListener listener) .append(DEFAULT_JVM_HEAP_USAGE_THRESHOLD) .append("%. ") .append(NO_ELIGIBLE_NODE_TO_RUN_DETECTOR) - .append(adTask.getDetectorId()); + .append(adTask.getConfigId()); String errorMessage = errorMessageBuilder.toString(); logger.warn(errorMessage + ", task id " + adTask.getTaskId() + ", " + adTask.getTaskType()); - listener.onFailure(new LimitExceededException(adTask.getDetectorId(), errorMessage)); + listener.onFailure(new LimitExceededException(adTask.getConfigId(), errorMessage)); return; } candidateNodeResponse = candidateNodeResponse @@ -732,10 +733,10 @@ private void dispatchTask(ADTask adTask, ActionListener listener) if (candidateNodeResponse.size() == 0) { StringBuilder errorMessageBuilder = new StringBuilder("All nodes' executing batch tasks exceeds limitation ") .append(NO_ELIGIBLE_NODE_TO_RUN_DETECTOR) - .append(adTask.getDetectorId()); + .append(adTask.getConfigId()); String errorMessage = errorMessageBuilder.toString(); logger.warn(errorMessage + ", task id " + adTask.getTaskId() + ", " + adTask.getTaskType()); - listener.onFailure(new LimitExceededException(adTask.getDetectorId(), errorMessage)); + listener.onFailure(new LimitExceededException(adTask.getConfigId(), errorMessage)); return; } Optional targetNode = candidateNodeResponse @@ -795,30 +796,30 @@ public void startADBatchTaskOnWorkerNode( private ActionListener internalBatchTaskListener(ADTask adTask, TransportService transportService) { String taskId = adTask.getTaskId(); - String detectorTaskId = adTask.getDetectorLevelTaskId(); - String detectorId = adTask.getDetectorId(); + String detectorTaskId = adTask.getConfigLevelTaskId(); + String detectorId = adTask.getConfigId(); ActionListener listener = ActionListener.wrap(response -> { // If batch task finished normally, remove task from cache and decrease executing task count by 1. 
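// Both the success and failure paths below perform the same cleanup (remove the task from the
// cache and decrement AD_EXECUTING_BATCH_TASK_COUNT) before diverging: single-stream tasks clean
// the detector cache (then record FINISHED or the exception), while HC entity tasks notify the
// coordinating node via entityTaskDone so it can poll the next entity.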
adTaskCacheManager.remove(taskId, detectorId, detectorTaskId); adStats.getStat(AD_EXECUTING_BATCH_TASK_COUNT.getName()).decrement(); - if (!adTask.getDetector().isMultientityDetector()) { + if (!adTask.getDetector().isHighCardinality()) { // Set single-entity detector task as FINISHED here adTaskManager .cleanDetectorCache( adTask, transportService, - () -> adTaskManager.updateADTask(taskId, ImmutableMap.of(STATE_FIELD, ADTaskState.FINISHED.name())) + () -> adTaskManager.updateADTask(taskId, ImmutableMap.of(STATE_FIELD, TaskState.FINISHED.name())) ); } else { // Set entity task as FINISHED here - adTaskManager.updateADTask(adTask.getTaskId(), ImmutableMap.of(STATE_FIELD, ADTaskState.FINISHED.name())); + adTaskManager.updateADTask(adTask.getTaskId(), ImmutableMap.of(STATE_FIELD, TaskState.FINISHED.name())); adTaskManager.entityTaskDone(adTask, null, transportService); } }, e -> { // If batch task failed, remove task from cache and decrease executing task count by 1. adTaskCacheManager.remove(taskId, detectorId, detectorTaskId); adStats.getStat(AD_EXECUTING_BATCH_TASK_COUNT.getName()).decrement(); - if (!adTask.getDetector().isMultientityDetector()) { + if (!adTask.getDetector().isHighCardinality()) { adTaskManager.cleanDetectorCache(adTask, transportService, () -> handleException(adTask, e)); } else { adTaskManager.entityTaskDone(adTask, e, transportService); @@ -838,7 +839,7 @@ private ActionListener internalBatchTaskListener(ADTask adTask, Transpor private void handleException(ADTask adTask, Exception e) { // Check if batch task was cancelled or not by exception type. // If it's cancelled, then increase cancelled task count by 1, otherwise increase failure count by 1. - if (e instanceof ADTaskCancelledException) { + if (e instanceof TaskCancelledException) { adStats.getStat(StatNames.AD_CANCELED_BATCH_TASK_COUNT.getName()).increment(); } else if (ExceptionUtil.countInStats(e)) { adStats.getStat(StatNames.AD_BATCH_TASK_FAILURE_COUNT.getName()).increment(); @@ -863,15 +864,15 @@ private void executeADBatchTaskOnWorkerNode(ADTask adTask, ActionListener { - long interval = ((IntervalTimeConfiguration) adTask.getDetector().getDetectionInterval()) - .toDuration() - .toMillis(); + long interval = ((IntervalTimeConfiguration) adTask.getDetector().getInterval()).toDuration().toMillis(); long expectedPieceEndTime = dataStartTime + pieceSize * interval; long firstPieceEndTime = Math.min(expectedPieceEndTime, dataEndTime); logger @@ -920,7 +919,7 @@ private void runFirstPiece(ADTask adTask, Instant executeStartTime, ActionListen interval, dataStartTime, dataEndTime, - adTask.getDetectorId(), + adTask.getConfigId(), adTask.getTaskId() ); getFeatureData( @@ -947,8 +946,8 @@ private void runFirstPiece(ADTask adTask, Instant executeStartTime, ActionListen private void getDateRangeOfSourceData(ADTask adTask, BiConsumer consumer, ActionListener internalListener) { String taskId = adTask.getTaskId(); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder() - .aggregation(AggregationBuilders.min(AGG_NAME_MIN_TIME).field(adTask.getDetector().getTimeField())) - .aggregation(AggregationBuilders.max(AGG_NAME_MAX_TIME).field(adTask.getDetector().getTimeField())) + .aggregation(AggregationBuilders.min(CommonName.AGG_NAME_MIN_TIME).field(adTask.getDetector().getTimeField())) + .aggregation(AggregationBuilders.max(CommonName.AGG_NAME_MAX_TIME).field(adTask.getDetector().getTimeField())) .size(0); if (adTask.getEntity() != null && adTask.getEntity().getAttributes().size() > 0) { BoolQueryBuilder query = 
new BoolQueryBuilder(); @@ -964,18 +963,18 @@ private void getDateRangeOfSourceData(ADTask adTask, BiConsumer cons .indices(adTask.getDetector().getIndices().toArray(new String[0])) .source(searchSourceBuilder); final ActionListener searchResponseListener = ActionListener.wrap(r -> { - InternalMin minAgg = r.getAggregations().get(AGG_NAME_MIN_TIME); - InternalMax maxAgg = r.getAggregations().get(AGG_NAME_MAX_TIME); + InternalMin minAgg = r.getAggregations().get(CommonName.AGG_NAME_MIN_TIME); + InternalMax maxAgg = r.getAggregations().get(CommonName.AGG_NAME_MAX_TIME); double minValue = minAgg.getValue(); double maxValue = maxAgg.getValue(); // If time field not exist or there is no value, will return infinity value if (minValue == Double.POSITIVE_INFINITY) { - internalListener.onFailure(new ResourceNotFoundException(adTask.getDetectorId(), "There is no data in the time field")); + internalListener.onFailure(new ResourceNotFoundException(adTask.getConfigId(), "There is no data in the time field")); return; } - long interval = ((IntervalTimeConfiguration) adTask.getDetector().getDetectionInterval()).toDuration().toMillis(); + long interval = ((IntervalTimeConfiguration) adTask.getDetector().getInterval()).toDuration().toMillis(); - DetectionDateRange detectionDateRange = adTask.getDetectionDateRange(); + DateRange detectionDateRange = adTask.getDetectionDateRange(); long dataStartTime = detectionDateRange.getStartTime().toEpochMilli(); long dataEndTime = detectionDateRange.getEndTime().toEpochMilli(); long minDate = (long) minValue; @@ -983,7 +982,7 @@ private void getDateRangeOfSourceData(ADTask adTask, BiConsumer cons if (minDate >= dataEndTime || maxDate <= dataStartTime) { internalListener - .onFailure(new ResourceNotFoundException(adTask.getDetectorId(), "There is no data in the detection date range")); + .onFailure(new ResourceNotFoundException(adTask.getConfigId(), "There is no data in the detection date range")); return; } if (minDate > dataStartTime) { @@ -998,7 +997,7 @@ private void getDateRangeOfSourceData(ADTask adTask, BiConsumer cons dataEndTime = dataEndTime - dataEndTime % interval; logger.debug("adjusted date range: start: {}, end: {}, taskId: {}", dataStartTime, dataEndTime, taskId); if ((dataEndTime - dataStartTime) < NUM_MIN_SAMPLES * interval) { - internalListener.onFailure(new AnomalyDetectionException("There is not enough data to train model").countedInStats(false)); + internalListener.onFailure(new TimeSeriesException("There is not enough data to train model").countedInStats(false)); return; } consumer.accept(dataStartTime, dataEndTime); @@ -1012,6 +1011,7 @@ private void getDateRangeOfSourceData(ADTask adTask, BiConsumer cons // user is the one who started historical detector. Read AnomalyDetectorJobTransportAction.doExecute. adTask.getUser(), client, + AnalysisType.AD, searchResponseListener ); } @@ -1098,15 +1098,15 @@ private void detectAnomaly( ? 
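The adjustment above clamps the requested detection range to the timestamps actually present in the source index, snaps both ends down to detector-interval boundaries, and rejects windows with too few samples to train a model. A worked example with made-up timestamps (NUM_MIN_SAMPLES is 32 in the current settings; the concrete numbers are illustrative):

    long interval = 60_000L;              // 1-minute detector interval
    long dataStartTime = 1_000_030_000L;  // already clamped to max(minDate, range start)
    long dataEndTime = 1_003_650_000L;    // already clamped to min(maxDate, range end)

    dataStartTime -= dataStartTime % interval; // 1_000_020_000: snapped down to a boundary
    dataEndTime -= dataEndTime % interval;     // 1_003_620_000: snapped down to a boundary

    long samples = (dataEndTime - dataStartTime) / interval; // 60 full intervals
    boolean enoughData = samples >= 32;        // 60 >= NUM_MIN_SAMPLES, so training can proceed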
"No full shingle in current detection window" : "No data in current detection window"; AnomalyResult anomalyResult = new AnomalyResult( - adTask.getDetectorId(), - adTask.getDetectorLevelTaskId(), + adTask.getConfigId(), + adTask.getConfigLevelTaskId(), featureData, Instant.ofEpochMilli(intervalEndTime - interval), Instant.ofEpochMilli(intervalEndTime), executeStartTime, Instant.now(), error, - adTask.getEntity(), + Optional.ofNullable(adTask.getEntity()), adTask.getDetector().getUser(), anomalyDetectionIndices.getSchemaVersion(ADIndex.RESULT), adTask.getEntityModelId() @@ -1124,9 +1124,9 @@ private void detectAnomaly( AnomalyResult anomalyResult = AnomalyResult .fromRawTRCFResult( - adTask.getDetectorId(), - adTask.getDetector().getDetectorIntervalInMilliseconds(), - adTask.getDetectorLevelTaskId(), + adTask.getConfigId(), + adTask.getDetector().getIntervalInMilliseconds(), + adTask.getConfigLevelTaskId(), score, descriptor.getAnomalyGrade(), descriptor.getDataConfidence(), @@ -1136,7 +1136,7 @@ private void detectAnomaly( executeStartTime, Instant.now(), null, - adTask.getEntity(), + Optional.ofNullable(adTask.getEntity()), adTask.getDetector().getUser(), anomalyDetectionIndices.getSchemaVersion(ADIndex.RESULT), adTask.getEntityModelId(), @@ -1163,7 +1163,7 @@ private void detectAnomaly( user = adTask.getUser().getName(); roles = adTask.getUser().getRoles(); } - String resultIndex = adTask.getDetector().getResultIndex(); + String resultIndex = adTask.getDetector().getCustomResultIndex(); if (resultIndex == null) { // if result index is null, store anomaly result directly @@ -1246,14 +1246,14 @@ private void runNextPiece( ActionListener internalListener ) { String taskId = adTask.getTaskId(); - String detectorId = adTask.getDetectorId(); - String detectorTaskId = adTask.getDetectorLevelTaskId(); + String detectorId = adTask.getConfigId(); + String detectorTaskId = adTask.getConfigLevelTaskId(); float initProgress = calculateInitProgress(taskId); - String taskState = initProgress >= 1.0f ? ADTaskState.RUNNING.name() : ADTaskState.INIT.name(); + String taskState = initProgress >= 1.0f ? 
TaskState.RUNNING.name() : TaskState.INIT.name(); logger.debug("Init progress: {}, taskState:{}, task id: {}", initProgress, taskState, taskId); if (initProgress >= 1.0f && adTask.isEntityTask()) { - updateDetectorLevelTaskState(detectorId, adTask.getParentTaskId(), ADTaskState.RUNNING.name()); + updateDetectorLevelTaskState(detectorId, adTask.getParentTaskId(), TaskState.RUNNING.name()); } if (pieceStartTime < dataEndTime) { @@ -1319,7 +1319,7 @@ private void runNextPiece( INIT_PROGRESS_FIELD, initProgress, STATE_FIELD, - ADTaskState.FINISHED + TaskState.FINISHED ), ActionListener.wrap(r -> internalListener.onResponse("task execution done"), e -> internalListener.onFailure(e)) ); @@ -1327,7 +1327,7 @@ private void runNextPiece( } private void updateDetectorLevelTaskState(String detectorId, String detectorTaskId, String newState) { - AnomalyDetectorFunction function = () -> adTaskManager + ExecutorFunction function = () -> adTaskManager .updateADTask(detectorTaskId, ImmutableMap.of(STATE_FIELD, newState), ActionListener.wrap(r -> { logger.info("Updated HC detector task: {} state as: {} for detector: {}", detectorTaskId, newState, detectorId); adTaskCacheManager.updateDetectorTaskState(detectorId, detectorTaskId, newState); @@ -1359,17 +1359,17 @@ private float calculateInitProgress(String taskId) { private void checkIfADTaskCancelledAndCleanupCache(ADTask adTask) { String taskId = adTask.getTaskId(); - String detectorId = adTask.getDetectorId(); - String detectorTaskId = adTask.getDetectorLevelTaskId(); + String detectorId = adTask.getConfigId(); + String detectorTaskId = adTask.getConfigLevelTaskId(); // refresh latest HC task run time adTaskCacheManager.refreshLatestHCTaskRunTime(detectorId); - if (adTask.getDetector().isMultientityDetector() + if (adTask.getDetector().isHighCardinality() && adTaskCacheManager.isHCTaskCoordinatingNode(detectorId) && adTaskCacheManager.isHistoricalAnalysisCancelledForHC(detectorId, detectorTaskId)) { // clean up pending and running entity on coordinating node adTaskCacheManager.clearPendingEntities(detectorId); adTaskCacheManager.removeRunningEntity(detectorId, adTaskManager.convertEntityToString(adTask)); - throw new ADTaskCancelledException( + throw new TaskCancelledException( adTaskCacheManager.getCancelReasonForHC(detectorId, detectorTaskId), adTaskCacheManager.getCancelledByForHC(detectorId, detectorTaskId) ); @@ -1387,7 +1387,7 @@ && isNullOrEmpty(adTaskCacheManager.getTasksOfDetector(detectorId))) { adTaskCacheManager.removeHistoricalTaskCache(detectorId); } - throw new ADTaskCancelledException(cancelReason, cancelledBy); + throw new TaskCancelledException(cancelReason, cancelledBy); } } diff --git a/src/main/java/org/opensearch/ad/task/ADHCBatchTaskRunState.java b/src/main/java/org/opensearch/ad/task/ADHCBatchTaskRunState.java index 91f00b4cd..7f4c70f81 100644 --- a/src/main/java/org/opensearch/ad/task/ADHCBatchTaskRunState.java +++ b/src/main/java/org/opensearch/ad/task/ADHCBatchTaskRunState.java @@ -13,7 +13,7 @@ import java.time.Instant; -import org.opensearch.ad.model.ADTaskState; +import org.opensearch.timeseries.model.TaskState; /** * Cache HC batch task running state on coordinating and worker node. 
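For context on the rename in this hunk: ADHCBatchTaskRunState is a small per-task holder that starts in INIT and is treated as expired once it has sat idle past a timeout. A condensed model of that behavior (the timeout parameter and class name are illustrative):

    import java.time.Instant;

    class RunStateSketch {
        private String detectorTaskState = "INIT"; // same default the constructor below sets
        private long lastTaskRunTimeInMillis = Instant.now().toEpochMilli();

        void touch() {
            lastTaskRunTimeInMillis = Instant.now().toEpochMilli();
        }

        boolean expired(long timeoutInMillis) {
            return lastTaskRunTimeInMillis + timeoutInMillis < Instant.now().toEpochMilli();
        }
    }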
@@ -32,7 +32,7 @@ public class ADHCBatchTaskRunState { private Long cancelledTimeInMillis; public ADHCBatchTaskRunState() { - this.detectorTaskState = ADTaskState.INIT.name(); + this.detectorTaskState = TaskState.INIT.name(); } public String getDetectorTaskState() { diff --git a/src/main/java/org/opensearch/ad/task/ADTaskCacheManager.java b/src/main/java/org/opensearch/ad/task/ADTaskCacheManager.java index 6e5d7a5ed..014a9f798 100644 --- a/src/main/java/org/opensearch/ad/task/ADTaskCacheManager.java +++ b/src/main/java/org/opensearch/ad/task/ADTaskCacheManager.java @@ -11,13 +11,10 @@ package org.opensearch.ad.task; -import static org.opensearch.ad.MemoryTracker.Origin.HISTORICAL_SINGLE_ENTITY_DETECTOR; -import static org.opensearch.ad.constant.CommonErrorMessages.DETECTOR_IS_RUNNING; -import static org.opensearch.ad.constant.CommonErrorMessages.EXCEED_HISTORICAL_ANALYSIS_LIMIT; +import static org.opensearch.ad.constant.ADCommonMessages.DETECTOR_IS_RUNNING; +import static org.opensearch.ad.constant.ADCommonMessages.EXCEED_HISTORICAL_ANALYSIS_LIMIT; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_TREES; -import static org.opensearch.ad.util.ParseUtils.isNullOrEmpty; +import static org.opensearch.timeseries.MemoryTracker.Origin.HISTORICAL_SINGLE_ENTITY_DETECTOR; import java.time.Instant; import java.util.ArrayList; @@ -27,37 +24,33 @@ import java.util.Map; import java.util.Objects; import java.util.Optional; -import java.util.Queue; import java.util.concurrent.ConcurrentHashMap; -import java.util.concurrent.ConcurrentLinkedQueue; import java.util.concurrent.Semaphore; import java.util.stream.Collectors; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.common.exception.DuplicateTaskException; -import org.opensearch.ad.common.exception.LimitExceededException; import org.opensearch.ad.model.ADTask; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; -import org.opensearch.core.action.ActionListener; -import org.opensearch.transport.TransportService; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.common.exception.DuplicateTaskException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.task.TaskCacheManager; +import org.opensearch.timeseries.util.ParseUtils; import com.amazon.randomcutforest.RandomCutForest; import com.amazon.randomcutforest.parkservices.ThresholdedRandomCutForest; import com.google.common.collect.ImmutableList; -public class ADTaskCacheManager { +public class ADTaskCacheManager extends TaskCacheManager { private final Logger logger = LogManager.getLogger(ADTaskCacheManager.class); private volatile Integer maxAdBatchTaskPerNode; - private volatile Integer maxCachedDeletedTask; private final MemoryTracker memoryTracker; private final int numberSize = 8; public static final int TASK_RETRY_LIMIT = 
3; @@ -89,19 +82,6 @@ public class ADTaskCacheManager { * <p>Key: detector id</p> */ private Map<String, Integer> detectorTaskSlotLimit; - /** - * This field is to cache all realtime tasks on coordinating node. - * <p>Node: coordinating node</p> - * <p>Key is detector id</p> - */ - private Map<String, ADRealtimeTaskCache> realtimeTaskCaches; - /** - * This field is to cache all deleted detector level tasks on coordinating node. - * Will try to clean up child task and AD result later. - * <p>Node: coordinating node</p> - * Check {@link ADTaskManager#cleanChildTasksAndADResultsOfDeletedTask()} - */ - private Queue<String> deletedDetectorTasks; // =================================================================== // Fields below are caches on worker node @@ -126,17 +106,6 @@ public class ADTaskCacheManager { */ private Map<String, Map<String, ADHCBatchTaskRunState>> hcBatchTaskRunState; - // =================================================================== - // Fields below are caches on any data node serves delete detector - // request. Check ADTaskManager#deleteADResultOfDetector - // =================================================================== - /** - * This field is to cache deleted detector IDs. Hourly cron will poll this queue - * and clean AD results. Check {@link ADTaskManager#cleanADResultOfDeletedDetector()} - * <p>Node: any data node serves delete detector request</p>
- */ - private Queue deletedDetectors; - /** * Constructor to create AD task cache manager. * @@ -145,17 +114,14 @@ public class ADTaskCacheManager { * @param memoryTracker AD memory tracker */ public ADTaskCacheManager(Settings settings, ClusterService clusterService, MemoryTracker memoryTracker) { + super(settings, clusterService); this.maxAdBatchTaskPerNode = MAX_BATCH_TASK_PER_NODE.get(settings); clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_BATCH_TASK_PER_NODE, it -> maxAdBatchTaskPerNode = it); - this.maxCachedDeletedTask = MAX_CACHED_DELETED_TASKS.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_CACHED_DELETED_TASKS, it -> maxCachedDeletedTask = it); this.batchTaskCaches = new ConcurrentHashMap<>(); this.memoryTracker = memoryTracker; this.detectorTasks = new ConcurrentHashMap<>(); this.hcBatchTaskCaches = new ConcurrentHashMap<>(); - this.realtimeTaskCaches = new ConcurrentHashMap<>(); - this.deletedDetectorTasks = new ConcurrentLinkedQueue<>(); - this.deletedDetectors = new ConcurrentLinkedQueue<>(); + this.detectorTaskSlotLimit = new ConcurrentHashMap<>(); this.hcBatchTaskRunState = new ConcurrentHashMap<>(); this.cleanExpiredHCBatchTaskRunStatesSemaphore = new Semaphore(1); @@ -171,7 +137,7 @@ public ADTaskCacheManager(Settings settings, ClusterService clusterService, Memo */ public synchronized void add(ADTask adTask) { String taskId = adTask.getTaskId(); - String detectorId = adTask.getDetectorId(); + String detectorId = adTask.getConfigId(); if (contains(taskId)) { throw new DuplicateTaskException(DETECTOR_IS_RUNNING); } @@ -189,7 +155,7 @@ public synchronized void add(ADTask adTask) { taskCache.getCacheMemorySize().set(neededCacheSize); batchTaskCaches.put(taskId, taskCache); if (adTask.isEntityTask()) { - ADHCBatchTaskRunState hcBatchTaskRunState = getHCBatchTaskRunState(detectorId, adTask.getDetectorLevelTaskId()); + ADHCBatchTaskRunState hcBatchTaskRunState = getHCBatchTaskRunState(detectorId, adTask.getConfigLevelTaskId()); if (hcBatchTaskRunState != null) { hcBatchTaskRunState.setLastTaskRunTimeInMillis(Instant.now().toEpochMilli()); } @@ -303,7 +269,7 @@ public boolean contains(String taskId) { * @return true if there is task in cache; otherwise return false */ public boolean containsTaskOfDetector(String detectorId) { - return batchTaskCaches.values().stream().filter(v -> Objects.equals(detectorId, v.getDetectorId())).findAny().isPresent(); + return batchTaskCaches.values().stream().filter(v -> Objects.equals(detectorId, v.getId())).findAny().isPresent(); } /** @@ -316,7 +282,7 @@ public List getTasksOfDetector(String detectorId) { return batchTaskCaches .values() .stream() - .filter(v -> Objects.equals(detectorId, v.getDetectorId())) + .filter(v -> Objects.equals(detectorId, v.getId())) .map(c -> c.getTaskId()) .collect(Collectors.toList()); } @@ -339,7 +305,7 @@ private ADBatchTaskCache getBatchTaskCache(String taskId) { } private List getBatchTaskCacheByDetectorId(String detectorId) { - return batchTaskCaches.values().stream().filter(v -> Objects.equals(detectorId, v.getDetectorId())).collect(Collectors.toList()); + return batchTaskCaches.values().stream().filter(v -> Objects.equals(detectorId, v.getId())).collect(Collectors.toList()); } /** @@ -354,8 +320,8 @@ private long calculateADTaskCacheSize(ADTask adTask) { return memoryTracker .estimateTRCFModelSize( dimension, - NUM_TREES, - AnomalyDetectorSettings.BATCH_BOUNDING_BOX_CACHE_RATIO, + TimeSeriesSettings.NUM_TREES, + 
TimeSeriesSettings.BATCH_BOUNDING_BOX_CACHE_RATIO, detector.getShingleSize().intValue(), false ) + shingleMemorySize(detector.getShingleSize(), detector.getEnabledFeatureIds().size()); @@ -373,8 +339,7 @@ public long getModelSize(String taskId) { RandomCutForest rcfForest = tRCF.getForest(); int dimensions = rcfForest.getDimensions(); int numberOfTrees = rcfForest.getNumberOfTrees(); - return memoryTracker - .estimateTRCFModelSize(dimensions, numberOfTrees, AnomalyDetectorSettings.BATCH_BOUNDING_BOX_CACHE_RATIO, 1, false); + return memoryTracker.estimateTRCFModelSize(dimensions, numberOfTrees, TimeSeriesSettings.BATCH_BOUNDING_BOX_CACHE_RATIO, 1, false); } /** @@ -483,7 +448,7 @@ public ADTaskCancellationState cancelByDetectorId(String detectorId, String dete taskStateCache.setCancelReason(reason); taskStateCache.setCancelledBy(userName); - if (isNullOrEmpty(taskCaches)) { + if (ParseUtils.isNullOrEmpty(taskCaches)) { return ADTaskCancellationState.NOT_FOUND; } @@ -506,7 +471,7 @@ public ADTaskCancellationState cancelByDetectorId(String detectorId, String dete public boolean isCancelled(String taskId) { // For HC detector, ADBatchTaskCache is entity task. ADBatchTaskCache taskCache = getBatchTaskCache(taskId); - String detectorId = taskCache.getDetectorId(); + String detectorId = taskCache.getId(); String detectorTaskId = taskCache.getDetectorTaskId(); ADHCBatchTaskRunState taskStateCache = getHCBatchTaskRunState(detectorId, detectorTaskId); @@ -810,7 +775,7 @@ public boolean isHCTaskRunning(String detectorId) { Optional entityTask = this.batchTaskCaches .values() .stream() - .filter(cache -> Objects.equals(detectorId, cache.getDetectorId()) && cache.getEntity() != null) + .filter(cache -> Objects.equals(detectorId, cache.getId()) && cache.getEntity() != null) .findFirst(); return entityTask.isPresent(); } @@ -1012,174 +977,6 @@ public void clearPendingEntities(String detectorId) { } } - /** - * Check if realtime task field value change needed or not by comparing with cache. - * 1. If new field value is null, will consider changed needed to this field. - * 2. will consider the real time task change needed if - * 1) init progress is larger or the old init progress is null, or - * 2) if the state is different, and it is not changing from running to init. - * for other fields, as long as field values changed, will consider the realtime - * task change needed. We did this so that the init progress or state won't go backwards. - * 3. If realtime task cache not found, will consider the realtime task change needed. - * - * @param detectorId detector id - * @param newState new task state - * @param newInitProgress new init progress - * @param newError new error - * @return true if realtime task change needed. 
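The javadoc above encodes three rules for refreshing the realtime task cache: a state change counts unless it would move RUNNING back to INIT, init progress may only move forward, and any new error message counts. This commit relocates that logic into the shared TaskCacheManager base class (per the new extends clause above); a condensed restatement of the rules as a standalone predicate, with string literals standing in for the task-state enum:

    class RealtimeChangeSketch {
        static boolean changeNeeded(String oldState, Float oldProgress, String oldError,
                                    String newState, Float newProgress, String newError) {
            boolean stateChange = newState != null
                && !newState.equals(oldState)
                && !("INIT".equals(newState) && "RUNNING".equals(oldState)); // never regress RUNNING -> INIT
            boolean progressChange = newProgress != null
                && (oldProgress == null || newProgress > oldProgress);       // init progress is monotonic
            boolean errorChange = newError != null && !newError.equals(oldError);
            return stateChange || progressChange || errorChange;
        }
    }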
- */ - public boolean isRealtimeTaskChangeNeeded(String detectorId, String newState, Float newInitProgress, String newError) { - if (realtimeTaskCaches.containsKey(detectorId)) { - ADRealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(detectorId); - boolean stateChangeNeeded = false; - String oldState = realtimeTaskCache.getState(); - if (newState != null - && !newState.equals(oldState) - && !(ADTaskState.INIT.name().equals(newState) && ADTaskState.RUNNING.name().equals(oldState))) { - stateChangeNeeded = true; - } - boolean initProgressChangeNeeded = false; - Float existingProgress = realtimeTaskCache.getInitProgress(); - if (newInitProgress != null - && !newInitProgress.equals(existingProgress) - && (existingProgress == null || newInitProgress > existingProgress)) { - initProgressChangeNeeded = true; - } - boolean errorChanged = false; - if (newError != null && !newError.equals(realtimeTaskCache.getError())) { - errorChanged = true; - } - if (stateChangeNeeded || initProgressChangeNeeded || errorChanged) { - return true; - } - return false; - } else { - return true; - } - } - - /** - * Update realtime task cache with new field values. If realtime task cache exist, update it - * directly if task is not done; if task is done, remove the detector's realtime task cache. - * - * If realtime task cache doesn't exist, will do nothing. Next realtime job run will re-init - * realtime task cache when it finds task cache not inited yet. - * Check {@link ADTaskManager#initRealtimeTaskCacheAndCleanupStaleCache(String, AnomalyDetector, TransportService, ActionListener)}, - * {@link ADTaskManager#updateLatestRealtimeTaskOnCoordinatingNode(String, String, Long, Long, String, ActionListener)} - * - * @param detectorId detector id - * @param newState new task state - * @param newInitProgress new init progress - * @param newError new error - */ - public void updateRealtimeTaskCache(String detectorId, String newState, Float newInitProgress, String newError) { - ADRealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(detectorId); - if (realtimeTaskCache != null) { - if (newState != null) { - realtimeTaskCache.setState(newState); - } - if (newInitProgress != null) { - realtimeTaskCache.setInitProgress(newInitProgress); - } - if (newError != null) { - realtimeTaskCache.setError(newError); - } - if (newState != null && !ADTaskState.NOT_ENDED_STATES.contains(newState)) { - // If task is done, will remove its realtime task cache. - logger.info("Realtime task done with state {}, remove RT task cache for detector ", newState, detectorId); - removeRealtimeTaskCache(detectorId); - } - } else { - logger.debug("Realtime task cache is not inited yet for detector {}", detectorId); - } - } - - public void initRealtimeTaskCache(String detectorId, long detectorIntervalInMillis) { - realtimeTaskCaches.put(detectorId, new ADRealtimeTaskCache(null, null, null, detectorIntervalInMillis)); - logger.debug("Realtime task cache inited"); - } - - public void refreshRealtimeJobRunTime(String detectorId) { - ADRealtimeTaskCache taskCache = realtimeTaskCaches.get(detectorId); - if (taskCache != null) { - taskCache.setLastJobRunTime(Instant.now().toEpochMilli()); - } - } - - /** - * Get detector IDs from realtime task cache. - * @return array of detector id - */ - public String[] getDetectorIdsInRealtimeTaskCache() { - return realtimeTaskCaches.keySet().toArray(new String[0]); - } - - /** - * Remove detector's realtime task from cache. 
- * @param detectorId detector id - */ - public void removeRealtimeTaskCache(String detectorId) { - if (realtimeTaskCaches.containsKey(detectorId)) { - logger.info("Delete realtime cache for detector {}", detectorId); - realtimeTaskCaches.remove(detectorId); - } - } - - public ADRealtimeTaskCache getRealtimeTaskCache(String detectorId) { - return realtimeTaskCaches.get(detectorId); - } - - /** - * Clear realtime task cache. - */ - public void clearRealtimeTaskCache() { - realtimeTaskCaches.clear(); - } - - /** - * Add deleted task's id to deleted detector tasks queue. - * @param taskId task id - */ - public void addDeletedDetectorTask(String taskId) { - if (deletedDetectorTasks.size() < maxCachedDeletedTask) { - deletedDetectorTasks.add(taskId); - } - } - - /** - * Check if deleted task queue has items. - * @return true if has deleted detector task in cache - */ - public boolean hasDeletedDetectorTask() { - return !deletedDetectorTasks.isEmpty(); - } - - /** - * Poll one deleted detector task. - * @return task id - */ - public String pollDeletedDetectorTask() { - return this.deletedDetectorTasks.poll(); - } - - /** - * Add deleted detector's id to deleted detector queue. - * @param detectorId detector id - */ - public void addDeletedDetector(String detectorId) { - if (deletedDetectors.size() < maxCachedDeletedTask) { - deletedDetectors.add(detectorId); - } - } - - /** - * Poll one deleted detector. - * @return detector id - */ - public String pollDeletedDetector() { - return this.deletedDetectors.poll(); - } - public String getDetectorTaskId(String detectorId) { return detectorTasks.get(detectorId); } @@ -1317,7 +1114,7 @@ public void cleanExpiredHCBatchTaskRunStates() { for (Map.Entry> detectorRunStates : hcBatchTaskRunState.entrySet()) { List taskIdOfExpiredStates = new ArrayList<>(); String detectorId = detectorRunStates.getKey(); - boolean noRunningTask = isNullOrEmpty(getTasksOfDetector(detectorId)); + boolean noRunningTask = ParseUtils.isNullOrEmpty(getTasksOfDetector(detectorId)); Map taskRunStates = detectorRunStates.getValue(); if (taskRunStates == null) { // If detector's task run state is null, add detector id to detectorIdOfEmptyStates and remove it from @@ -1362,32 +1159,4 @@ public void cleanExpiredHCBatchTaskRunStates() { } } - /** - * We query result index to check if there are any result generated for detector to tell whether it passed initialization of not. - * To avoid repeated query when there is no data, record whether we have done that or not. - * @param id detector id - */ - public void markResultIndexQueried(String id) { - ADRealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(id); - // we initialize a real time cache at the beginning of AnomalyResultTransportAction if it - // cannot be found. If the cache is empty, we will return early and wait it for it to be - // initialized. - if (realtimeTaskCache != null) { - realtimeTaskCache.setQueriedResultIndex(true); - } - } - - /** - * We query result index to check if there are any result generated for detector to tell whether it passed initialization of not. - * - * @param id detector id - * @return whether we have queried result index or not. 
- */ - public boolean hasQueriedResultIndex(String id) { - ADRealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(id); - if (realtimeTaskCache != null) { - return realtimeTaskCache.hasQueriedResultIndex(); - } - return false; - } } diff --git a/src/main/java/org/opensearch/ad/task/ADTaskManager.java b/src/main/java/org/opensearch/ad/task/ADTaskManager.java index 7a10dd738..268bbc26a 100644 --- a/src/main/java/org/opensearch/ad/task/ADTaskManager.java +++ b/src/main/java/org/opensearch/ad/task/ADTaskManager.java @@ -12,16 +12,12 @@ package org.opensearch.ad.task; import static org.opensearch.action.DocWriteResponse.Result.CREATED; -import static org.opensearch.ad.AnomalyDetectorPlugin.AD_BATCH_TASK_THREAD_POOL_NAME; -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_LATEST_TASK; -import static org.opensearch.ad.constant.CommonErrorMessages.CREATE_INDEX_NOT_ACKNOWLEDGED; -import static org.opensearch.ad.constant.CommonErrorMessages.DETECTOR_IS_RUNNING; -import static org.opensearch.ad.constant.CommonErrorMessages.EXCEED_HISTORICAL_ANALYSIS_LIMIT; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG; -import static org.opensearch.ad.constant.CommonErrorMessages.HC_DETECTOR_TASK_IS_UPDATING; -import static org.opensearch.ad.constant.CommonErrorMessages.NO_ELIGIBLE_NODE_TO_RUN_DETECTOR; -import static org.opensearch.ad.constant.CommonName.DETECTION_STATE_INDEX; -import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN; +import static org.opensearch.ad.constant.ADCommonMessages.DETECTOR_IS_RUNNING; +import static org.opensearch.ad.constant.ADCommonMessages.EXCEED_HISTORICAL_ANALYSIS_LIMIT; +import static org.opensearch.ad.constant.ADCommonMessages.HC_DETECTOR_TASK_IS_UPDATING; +import static org.opensearch.ad.constant.ADCommonMessages.NO_ELIGIBLE_NODE_TO_RUN_DETECTOR; +import static org.opensearch.ad.constant.ADCommonName.DETECTION_STATE_INDEX; +import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN; import static org.opensearch.ad.model.ADTask.COORDINATING_NODE_FIELD; import static org.opensearch.ad.model.ADTask.DETECTOR_ID_FIELD; import static org.opensearch.ad.model.ADTask.ERROR_FIELD; @@ -36,30 +32,32 @@ import static org.opensearch.ad.model.ADTask.STOPPED_BY_FIELD; import static org.opensearch.ad.model.ADTask.TASK_PROGRESS_FIELD; import static org.opensearch.ad.model.ADTask.TASK_TYPE_FIELD; -import static org.opensearch.ad.model.ADTaskState.NOT_ENDED_STATES; import static org.opensearch.ad.model.ADTaskType.ALL_HISTORICAL_TASK_TYPES; import static org.opensearch.ad.model.ADTaskType.HISTORICAL_DETECTOR_TASK_TYPES; import static org.opensearch.ad.model.ADTaskType.REALTIME_TASK_TYPES; -import static org.opensearch.ad.model.ADTaskType.taskTypeToString; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; -import static org.opensearch.ad.model.AnomalyResult.TASK_ID_FIELD; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT; import static org.opensearch.ad.settings.AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.DELETE_AD_RESULT_WHEN_DELETE_DETECTOR; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_OLD_AD_TASK_DOCS; import static 
org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_OLD_AD_TASK_DOCS_PER_DETECTOR; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_MIN_SAMPLES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT; import static org.opensearch.ad.stats.InternalStatNames.AD_DETECTOR_ASSIGNED_BATCH_TASK_SLOT_COUNT; import static org.opensearch.ad.stats.InternalStatNames.AD_USED_BATCH_TASK_SLOT_COUNT; -import static org.opensearch.ad.util.ExceptionUtil.getErrorMessage; -import static org.opensearch.ad.util.ExceptionUtil.getShardsFailure; -import static org.opensearch.ad.util.ParseUtils.isNullOrEmpty; -import static org.opensearch.ad.util.RestHandlerUtils.XCONTENT_WITH_TYPE; -import static org.opensearch.ad.util.RestHandlerUtils.createXContentParserFromRegistry; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.TimeSeriesAnalyticsPlugin.AD_BATCH_TASK_THREAD_POOL_NAME; +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_LATEST_TASK; +import static org.opensearch.timeseries.constant.CommonMessages.CREATE_INDEX_NOT_ACKNOWLEDGED; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG; +import static org.opensearch.timeseries.constant.CommonName.TASK_ID_FIELD; +import static org.opensearch.timeseries.model.TaskState.NOT_ENDED_STATES; +import static org.opensearch.timeseries.model.TaskType.taskTypeToString; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_MIN_SAMPLES; +import static org.opensearch.timeseries.util.ExceptionUtil.getErrorMessage; +import static org.opensearch.timeseries.util.ExceptionUtil.getShardsFailure; +import static org.opensearch.timeseries.util.ParseUtils.isNullOrEmpty; +import static org.opensearch.timeseries.util.RestHandlerUtils.XCONTENT_WITH_TYPE; +import static org.opensearch.timeseries.util.RestHandlerUtils.createXContentParserFromRegistry; import java.io.IOException; import java.time.Instant; @@ -102,25 +100,14 @@ import org.opensearch.action.update.UpdateRequest; import org.opensearch.action.update.UpdateResponse; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.common.exception.ADTaskCancelledException; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.DuplicateTaskException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.ADEntityTaskProfile; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskAction; import org.opensearch.ad.model.ADTaskProfile; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; import org.opensearch.ad.model.DetectorProfile; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; import org.opensearch.ad.rest.handler.IndexAnomalyDetectorJobActionHandler; import 
org.opensearch.ad.transport.ADBatchAnomalyResultAction; import org.opensearch.ad.transport.ADBatchAnomalyResultRequest; @@ -132,11 +119,8 @@ import org.opensearch.ad.transport.ADTaskProfileAction; import org.opensearch.ad.transport.ADTaskProfileNodeResponse; import org.opensearch.ad.transport.ADTaskProfileRequest; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.ad.transport.ForwardADTaskAction; import org.opensearch.ad.transport.ForwardADTaskRequest; -import org.opensearch.ad.util.DiscoveryNodeFilterer; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; @@ -169,6 +153,22 @@ import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.sort.SortOrder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.DuplicateTaskException; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.common.exception.TaskCancelledException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.task.RealtimeTaskCache; +import org.opensearch.timeseries.transport.JobResponse; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.RestHandlerUtils; import org.opensearch.transport.TransportRequestOptions; import org.opensearch.transport.TransportService; @@ -190,7 +190,7 @@ public class ADTaskManager { private final Client client; private final ClusterService clusterService; private final NamedXContentRegistry xContentRegistry; - private final AnomalyDetectionIndices detectionIndices; + private final ADIndexManagement detectionIndices; private final DiscoveryNodeFilterer nodeFilter; private final ADTaskCacheManager adTaskCacheManager; @@ -214,7 +214,7 @@ public ADTaskManager( ClusterService clusterService, Client client, NamedXContentRegistry xContentRegistry, - AnomalyDetectionIndices detectionIndices, + ADIndexManagement detectionIndices, DiscoveryNodeFilterer nodeFilter, HashRing hashRing, ADTaskCacheManager adTaskCacheManager, @@ -252,9 +252,9 @@ public ADTaskManager( transportRequestOptions = TransportRequestOptions .builder() .withType(TransportRequestOptions.Type.REG) - .withTimeout(REQUEST_TIMEOUT.get(settings)) + .withTimeout(AD_REQUEST_TIMEOUT.get(settings)) .build(); - clusterService.getClusterSettings().addSettingsUpdateConsumer(REQUEST_TIMEOUT, it -> { + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_REQUEST_TIMEOUT, it -> { transportRequestOptions = TransportRequestOptions.builder().withType(TransportRequestOptions.Type.REG).withTimeout(it).build(); }); this.threadPool = threadPool; @@ -276,19 +276,19 @@ public ADTaskManager( */ public void startDetector( String detectorId, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, IndexAnomalyDetectorJobActionHandler handler, User user, TransportService transportService, 
ThreadContext.StoredContext context, - ActionListener listener + ActionListener listener ) { // upgrade index mapping of AD default indices detectionIndices.update(); getDetector(detectorId, (detector) -> { if (!detector.isPresent()) { - listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, RestStatus.NOT_FOUND)); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + detectorId, RestStatus.NOT_FOUND)); return; } @@ -298,7 +298,7 @@ public void startDetector( listener.onFailure(new OpenSearchStatusException(errorMessage, RestStatus.BAD_REQUEST)); return; } - String resultIndex = detector.get().getResultIndex(); + String resultIndex = detector.get().getCustomResultIndex(); if (resultIndex == null) { startRealtimeOrHistoricalDetection(detectionDateRange, handler, user, transportService, listener, detector); return; @@ -315,11 +315,11 @@ public void startDetector( } private void startRealtimeOrHistoricalDetection( - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, IndexAnomalyDetectorJobActionHandler handler, User user, TransportService transportService, - ActionListener listener, + ActionListener listener, Optional detector ) { try (ThreadContext.StoredContext context = client.threadPool().getThreadContext().stashContext()) { @@ -352,10 +352,10 @@ private void startRealtimeOrHistoricalDetection( */ protected void forwardApplyForTaskSlotsRequestToLeadNode( AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, TransportService transportService, - ActionListener listener + ActionListener listener ) { ForwardADTaskRequest forwardADTaskRequest = new ForwardADTaskRequest( detector, @@ -369,7 +369,7 @@ protected void forwardApplyForTaskSlotsRequestToLeadNode( public void forwardScaleTaskSlotRequestToLeadNode( ADTask adTask, TransportService transportService, - ActionListener listener + ActionListener listener ) { forwardRequestToLeadNode(new ForwardADTaskRequest(adTask, ADTaskAction.CHECK_AVAILABLE_TASK_SLOTS), transportService, listener); } @@ -377,7 +377,7 @@ public void forwardScaleTaskSlotRequestToLeadNode( public void forwardRequestToLeadNode( ForwardADTaskRequest forwardADTaskRequest, TransportService transportService, - ActionListener listener + ActionListener listener ) { hashRing.buildAndGetOwningNodeWithSameLocalAdVersion(AD_TASK_LEAD_NODE_MODEL_ID, node -> { if (!node.isPresent()) { @@ -390,7 +390,7 @@ public void forwardRequestToLeadNode( ForwardADTaskAction.NAME, forwardADTaskRequest, transportRequestOptions, - new ActionListenerResponseHandler<>(listener, AnomalyDetectorJobResponse::new) + new ActionListenerResponseHandler<>(listener, JobResponse::new) ); }, listener); } @@ -407,13 +407,13 @@ public void forwardRequestToLeadNode( */ public void startHistoricalAnalysis( AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, int availableTaskSlots, TransportService transportService, - ActionListener listener + ActionListener listener ) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); hashRing.buildAndGetOwningNodeWithSameLocalAdVersion(detectorId, owningNode -> { if (!owningNode.isPresent()) { logger.debug("Can't find eligible node to run as AD task's coordinating node"); @@ -459,13 +459,13 @@ public void startHistoricalAnalysis( */ protected void forwardDetectRequestToCoordinatingNode( AnomalyDetector detector, - DetectionDateRange detectionDateRange, + 
DateRange detectionDateRange, User user, Integer availableTaskSlots, ADTaskAction adTaskAction, TransportService transportService, DiscoveryNode node, - ActionListener listener + ActionListener listener ) { Version adVersion = hashRing.getAdVersion(node.getId()); transportService @@ -476,7 +476,7 @@ protected void forwardDetectRequestToCoordinatingNode( // node, check ADTaskManager#cleanDetectorCache. new ForwardADTaskRequest(detector, detectionDateRange, user, adTaskAction, availableTaskSlots, adVersion), transportRequestOptions, - new ActionListenerResponseHandler<>(listener, AnomalyDetectorJobResponse::new) + new ActionListenerResponseHandler<>(listener, JobResponse::new) ); } @@ -492,7 +492,7 @@ protected void forwardADTaskToCoordinatingNode( ADTask adTask, ADTaskAction adTaskAction, TransportService transportService, - ActionListener listener + ActionListener listener ) { logger.debug("Forward AD task to coordinating node, task id: {}, action: {}", adTask.getTaskId(), adTaskAction.name()); transportService @@ -501,7 +501,7 @@ protected void forwardADTaskToCoordinatingNode( ForwardADTaskAction.NAME, new ForwardADTaskRequest(adTask, adTaskAction), transportRequestOptions, - new ActionListenerResponseHandler<>(listener, AnomalyDetectorJobResponse::new) + new ActionListenerResponseHandler<>(listener, JobResponse::new) ); } @@ -519,7 +519,7 @@ protected void forwardStaleRunningEntitiesToCoordinatingNode( ADTaskAction adTaskAction, TransportService transportService, List staleRunningEntity, - ActionListener listener + ActionListener listener ) { transportService .sendRequest( @@ -527,7 +527,7 @@ protected void forwardStaleRunningEntitiesToCoordinatingNode( ForwardADTaskAction.NAME, new ForwardADTaskRequest(adTask, adTaskAction, staleRunningEntity), transportRequestOptions, - new ActionListenerResponseHandler<>(listener, AnomalyDetectorJobResponse::new) + new ActionListenerResponseHandler<>(listener, JobResponse::new) ); } @@ -547,13 +547,13 @@ protected void forwardStaleRunningEntitiesToCoordinatingNode( public void checkTaskSlots( ADTask adTask, AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, ADTaskAction afterCheckAction, TransportService transportService, - ActionListener listener + ActionListener listener ) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); logger.debug("Start checking task slots for detector: {}, task action: {}", detectorId, afterCheckAction); if (!checkingTaskSlot.tryAcquire()) { logger.info("Can't acquire checking task slot semaphore for detector {}", detectorId); @@ -566,7 +566,7 @@ public void checkTaskSlots( ); return; } - ActionListener wrappedActionListener = ActionListener.runAfter(listener, () -> { + ActionListener wrappedActionListener = ActionListener.runAfter(listener, () -> { checkingTaskSlot.release(1); logger.debug("Release checking task slot semaphore on lead node for detector {}", detectorId); }); @@ -623,9 +623,7 @@ public void checkTaskSlots( // then we will assign 4 tasks slots to this HC detector (4 is less than 8). The data index // only has 2 entities. So we assign 2 more task slots than actual need. But it's ok as we // will auto tune task slot when historical analysis task starts. - int approvedTaskSlots = detector.isMultientityDetector() - ? Math.min(maxRunningEntitiesPerDetector, availableAdTaskSlots) - : 1; + int approvedTaskSlots = detector.isHighCardinality() ? 
Math.min(maxRunningEntitiesPerDetector, availableAdTaskSlots) : 1; forwardToCoordinatingNode( adTask, detector, @@ -646,21 +644,16 @@ public void checkTaskSlots( private void forwardToCoordinatingNode( ADTask adTask, AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, ADTaskAction targetActionOfTaskSlotChecking, TransportService transportService, - ActionListener wrappedActionListener, + ActionListener wrappedActionListener, int approvedTaskSlots ) { switch (targetActionOfTaskSlotChecking) { case START: - logger - .info( - "Will assign {} task slots to run historical analysis for detector {}", - approvedTaskSlots, - detector.getDetectorId() - ); + logger.info("Will assign {} task slots to run historical analysis for detector {}", approvedTaskSlots, detector.getId()); startHistoricalAnalysis(detector, detectionDateRange, user, approvedTaskSlots, transportService, wrappedActionListener); break; case SCALE_ENTITY_TASK_SLOTS: @@ -668,12 +661,12 @@ private void forwardToCoordinatingNode( .info( "There are {} task slots available now to scale historical analysis task lane for detector {}", approvedTaskSlots, - adTask.getDetectorId() + adTask.getConfigId() ); scaleTaskLaneOnCoordinatingNode(adTask, approvedTaskSlots, transportService, wrappedActionListener); break; default: - wrappedActionListener.onFailure(new AnomalyDetectionException("Unknown task action " + targetActionOfTaskSlotChecking)); + wrappedActionListener.onFailure(new TimeSeriesException("Unknown task action " + targetActionOfTaskSlotChecking)); break; } } @@ -682,7 +675,7 @@ protected void scaleTaskLaneOnCoordinatingNode( ADTask adTask, int approvedTaskSlot, TransportService transportService, - ActionListener listener + ActionListener listener ) { DiscoveryNode coordinatingNode = getCoordinatingNode(adTask); transportService @@ -691,7 +684,7 @@ protected void scaleTaskLaneOnCoordinatingNode( ForwardADTaskAction.NAME, new ForwardADTaskRequest(adTask, approvedTaskSlot, ADTaskAction.SCALE_ENTITY_TASK_SLOTS), transportRequestOptions, - new ActionListenerResponseHandler<>(listener, AnomalyDetectorJobResponse::new) + new ActionListenerResponseHandler<>(listener, JobResponse::new) ); } @@ -706,7 +699,7 @@ private DiscoveryNode getCoordinatingNode(ADTask adTask) { } } if (targetNode == null) { - throw new ResourceNotFoundException(adTask.getDetectorId(), "AD task coordinating node not found"); + throw new ResourceNotFoundException(adTask.getConfigId(), "AD task coordinating node not found"); } return targetNode; } @@ -730,15 +723,15 @@ private DiscoveryNode getCoordinatingNode(ADTask adTask) { */ public void startDetector( AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, TransportService transportService, - ActionListener listener + ActionListener listener ) { try { - if (detectionIndices.doesDetectorStateIndexExist()) { + if (detectionIndices.doesStateIndexExist()) { // If detection index exist, check if latest AD task is running - getAndExecuteOnLatestDetectorLevelTask(detector.getDetectorId(), getADTaskTypes(detectionDateRange), (adTask) -> { + getAndExecuteOnLatestDetectorLevelTask(detector.getId(), getADTaskTypes(detectionDateRange), (adTask) -> { if (!adTask.isPresent() || adTask.get().isDone()) { updateLatestFlagOfOldTasksAndCreateNewTask(detector, detectionDateRange, user, listener); } else { @@ -747,7 +740,7 @@ public void startDetector( }, transportService, true, listener); } else { // If detection index doesn't 
exist, create index and execute detector. - detectionIndices.initDetectionStateIndex(ActionListener.wrap(r -> { + detectionIndices.initStateIndex(ActionListener.wrap(r -> { if (r.isAcknowledged()) { logger.info("Created {} with mappings.", DETECTION_STATE_INDEX); updateLatestFlagOfOldTasksAndCreateNewTask(detector, detectionDateRange, user, listener); @@ -766,20 +759,20 @@ public void startDetector( })); } } catch (Exception e) { - logger.error("Failed to start detector " + detector.getDetectorId(), e); + logger.error("Failed to start detector " + detector.getId(), e); listener.onFailure(e); } } - private ADTaskType getADTaskType(AnomalyDetector detector, DetectionDateRange detectionDateRange) { + private ADTaskType getADTaskType(AnomalyDetector detector, DateRange detectionDateRange) { if (detectionDateRange == null) { - return detector.isMultientityDetector() ? ADTaskType.REALTIME_HC_DETECTOR : ADTaskType.REALTIME_SINGLE_ENTITY; + return detector.isHighCardinality() ? ADTaskType.REALTIME_HC_DETECTOR : ADTaskType.REALTIME_SINGLE_ENTITY; } else { - return detector.isMultientityDetector() ? ADTaskType.HISTORICAL_HC_DETECTOR : ADTaskType.HISTORICAL_SINGLE_ENTITY; + return detector.isHighCardinality() ? ADTaskType.HISTORICAL_HC_DETECTOR : ADTaskType.HISTORICAL_SINGLE_ENTITY; } } - private List getADTaskTypes(DetectionDateRange detectionDateRange) { + private List getADTaskTypes(DateRange detectionDateRange) { return getADTaskTypes(detectionDateRange, false); } @@ -793,7 +786,7 @@ private List getADTaskTypes(DetectionDateRange detectionDateRange) { * @param resetLatestTaskStateFlag reset latest task state or not * @return list of AD task types */ - private List getADTaskTypes(DetectionDateRange detectionDateRange, boolean resetLatestTaskStateFlag) { + private List getADTaskTypes(DateRange detectionDateRange, boolean resetLatestTaskStateFlag) { if (detectionDateRange == null) { return REALTIME_TASK_TYPES; } else { @@ -824,11 +817,11 @@ public void stopDetector( IndexAnomalyDetectorJobActionHandler handler, User user, TransportService transportService, - ActionListener listener + ActionListener listener ) { getDetector(detectorId, (detector) -> { if (!detector.isPresent()) { - listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, RestStatus.NOT_FOUND)); + listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + detectorId, RestStatus.NOT_FOUND)); return; } if (historical) { @@ -858,7 +851,7 @@ public void stopDetector( * @param action listener response type */ public void getDetector(String detectorId, Consumer> function, ActionListener listener) { - GetRequest getRequest = new GetRequest(ANOMALY_DETECTORS_INDEX, detectorId); + GetRequest getRequest = new GetRequest(CommonName.CONFIG_INDEX, detectorId); client.get(getRequest, ActionListener.wrap(response -> { if (!response.isExists()) { function.accept(Optional.empty()); @@ -1076,7 +1069,7 @@ private void resetLatestDetectorTaskState( private void resetRealtimeDetectorTaskState( List runningRealtimeTasks, - AnomalyDetectorFunction function, + ExecutorFunction function, TransportService transportService, ActionListener listener ) { @@ -1085,13 +1078,13 @@ private void resetRealtimeDetectorTaskState( return; } ADTask adTask = runningRealtimeTasks.get(0); - String detectorId = adTask.getDetectorId(); - GetRequest getJobRequest = new GetRequest(ANOMALY_DETECTOR_JOB_INDEX).id(detectorId); + String detectorId = adTask.getConfigId(); + GetRequest getJobRequest = new 
GetRequest(CommonName.JOB_INDEX).id(detectorId); client.get(getJobRequest, ActionListener.wrap(r -> { if (r.isExists()) { try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, r.getSourceAsBytesRef())) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); + Job job = Job.parse(parser); if (!job.isEnabled()) { logger.debug("AD job is disabled, reset realtime task as stopped for detector {}", detectorId); resetTaskStateAsStopped(adTask, function, transportService, listener); @@ -1114,7 +1107,7 @@ private void resetRealtimeDetectorTaskState( private void resetHistoricalDetectorTaskState( List runningHistoricalTasks, - AnomalyDetectorFunction function, + ExecutorFunction function, TransportService transportService, ActionListener listener ) { @@ -1138,11 +1131,11 @@ private void resetHistoricalDetectorTaskState( if (taskStopped) { logger.debug("Reset task state as stopped, task id: {}", adTask.getTaskId()); if (taskProfile.getTaskId() == null // This means coordinating node doesn't have HC detector cache - && detector.isMultientityDetector() + && detector.isHighCardinality() && !isNullOrEmpty(taskProfile.getEntityTaskProfiles())) { // If coordinating node restarted, HC detector cache on it will be gone. But worker node still // runs entity tasks, we'd better stop these entity tasks to clean up resource earlier. - stopHistoricalAnalysis(adTask.getDetectorId(), Optional.of(adTask), null, ActionListener.wrap(r -> { + stopHistoricalAnalysis(adTask.getConfigId(), Optional.of(adTask), null, ActionListener.wrap(r -> { logger.debug("Restop detector successfully"); resetTaskStateAsStopped(adTask, function, transportService, listener); }, e -> { @@ -1191,17 +1184,17 @@ private void resetHistoricalDetectorTaskState( } private boolean isTaskStopped(String taskId, AnomalyDetector detector, ADTaskProfile taskProfile) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); if (taskProfile == null || !Objects.equals(taskId, taskProfile.getTaskId())) { logger.debug("AD task not found for task {} detector {}", taskId, detectorId); // If no node is running this task, reset it as STOPPED. 
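The isTaskStopped checks beginning above collapse to a short predicate: a task counts as stopped when no node reports a profile for it, when a single-entity task has no worker node, or when an HC task has initialized all entities yet reports none running. A boolean restatement (parameter names are illustrative):

    class StopCheckSketch {
        static boolean isTaskStopped(boolean highCardinality, boolean profileMissing,
                                     boolean workerNodeMissing, boolean entitiesInited,
                                     boolean noRunningEntities, boolean noEntityTaskProfiles) {
            if (profileMissing) {
                return true; // nobody is running the task
            }
            if (!highCardinality && workerNodeMissing) {
                return true; // single-entity task lost its worker
            }
            return highCardinality && entitiesInited && noRunningEntities && noEntityTaskProfiles;
        }
    }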
return true; } - if (!detector.isMultientityDetector() && taskProfile.getNodeId() == null) { + if (!detector.isHighCardinality() && taskProfile.getNodeId() == null) { logger.debug("AD task not running for single entity detector {}, task {}", detectorId, taskId); return true; } - if (detector.isMultientityDetector() + if (detector.isHighCardinality() && taskProfile.getTotalEntitiesInited() && isNullOrEmpty(taskProfile.getRunningEntities()) && isNullOrEmpty(taskProfile.getEntityTaskProfiles()) @@ -1219,12 +1212,7 @@ public boolean hcBatchTaskExpired(Long latestHCTaskRunTime) { return latestHCTaskRunTime + HC_BATCH_TASK_CACHE_TIMEOUT_IN_MILLIS < Instant.now().toEpochMilli(); } - private void stopHistoricalAnalysis( - String detectorId, - Optional adTask, - User user, - ActionListener listener - ) { + private void stopHistoricalAnalysis(String detectorId, Optional adTask, User user, ActionListener listener) { if (!adTask.isPresent()) { listener.onFailure(new ResourceNotFoundException(detectorId, "Detector not started")); return; @@ -1241,7 +1229,7 @@ private void stopHistoricalAnalysis( ADCancelTaskRequest cancelTaskRequest = new ADCancelTaskRequest(detectorId, taskId, userName, dataNodes); client.execute(ADCancelTaskAction.INSTANCE, cancelTaskRequest, ActionListener.wrap(response -> { - listener.onResponse(new AnomalyDetectorJobResponse(taskId, 0, 0, 0, RestStatus.OK)); + listener.onResponse(new JobResponse(taskId)); }, e -> { logger.error("Failed to cancel AD task " + taskId + ", detector id: " + detectorId, e); listener.onFailure(e); @@ -1256,15 +1244,15 @@ private boolean lastUpdateTimeOfHistoricalTaskExpired(ADTask adTask) { private void resetTaskStateAsStopped( ADTask adTask, - AnomalyDetectorFunction function, + ExecutorFunction function, TransportService transportService, ActionListener listener ) { cleanDetectorCache(adTask, transportService, () -> { String taskId = adTask.getTaskId(); - Map updatedFields = ImmutableMap.of(STATE_FIELD, ADTaskState.STOPPED.name()); + Map updatedFields = ImmutableMap.of(STATE_FIELD, TaskState.STOPPED.name()); updateADTask(taskId, updatedFields, ActionListener.wrap(r -> { - adTask.setState(ADTaskState.STOPPED.name()); + adTask.setState(TaskState.STOPPED.name()); if (function != null) { function.execute(); } @@ -1289,7 +1277,7 @@ private void resetEntityTasksAsStopped(String detectorTaskId) { query.filter(new TermsQueryBuilder(STATE_FIELD, NOT_ENDED_STATES)); updateByQueryRequest.setQuery(query); updateByQueryRequest.setRefresh(true); - String script = String.format(Locale.ROOT, "ctx._source.%s='%s';", STATE_FIELD, ADTaskState.STOPPED.name()); + String script = String.format(Locale.ROOT, "ctx._source.%s='%s';", STATE_FIELD, TaskState.STOPPED.name()); updateByQueryRequest.setScript(new Script(script)); client.execute(UpdateByQueryAction.INSTANCE, updateByQueryRequest, ActionListener.wrap(r -> { @@ -1320,11 +1308,11 @@ private void resetEntityTasksAsStopped(String detectorTaskId) { public void cleanDetectorCache( ADTask adTask, TransportService transportService, - AnomalyDetectorFunction function, + ExecutorFunction function, ActionListener listener ) { String coordinatingNode = adTask.getCoordinatingNode(); - String detectorId = adTask.getDetectorId(); + String detectorId = adTask.getConfigId(); String taskId = adTask.getTaskId(); try { forwardADTaskToCoordinatingNode( @@ -1351,8 +1339,8 @@ public void cleanDetectorCache( } } - protected void cleanDetectorCache(ADTask adTask, TransportService transportService, AnomalyDetectorFunction function) { - String 
@@ -1351,8 +1339,8 @@ public void cleanDetectorCache(
        }
    }

-   protected void cleanDetectorCache(ADTask adTask, TransportService transportService, AnomalyDetectorFunction function) {
-       String detectorId = adTask.getDetectorId();
+   protected void cleanDetectorCache(ADTask adTask, TransportService transportService, ExecutorFunction function) {
+       String detectorId = adTask.getConfigId();
        String taskId = adTask.getTaskId();
        cleanDetectorCache(adTask, transportService, function, ActionListener.wrap(r -> {
            logger.debug("Successfully cleaned cache for detector {}, task {}", detectorId, taskId);
@@ -1399,7 +1387,7 @@ public void getLatestHistoricalTaskProfile(
     * @param listener action listener
     */
    private void getADTaskProfile(ADTask adDetectorLevelTask, ActionListener listener) {
-       String detectorId = adDetectorLevelTask.getDetectorId();
+       String detectorId = adDetectorLevelTask.getConfigId();

        hashRing.getAllEligibleDataNodesWithKnownAdVersion(dataNodes -> {
            ADTaskProfileRequest adTaskProfileRequest = new ADTaskProfileRequest(detectorId, dataNodes);
@@ -1460,14 +1448,14 @@ private String validateDetector(AnomalyDetector detector) {

    private void updateLatestFlagOfOldTasksAndCreateNewTask(
        AnomalyDetector detector,
-       DetectionDateRange detectionDateRange,
+       DateRange detectionDateRange,
        User user,
-       ActionListener listener
+       ActionListener listener
    ) {
        UpdateByQueryRequest updateByQueryRequest = new UpdateByQueryRequest();
        updateByQueryRequest.indices(DETECTION_STATE_INDEX);
        BoolQueryBuilder query = new BoolQueryBuilder();
-       query.filter(new TermQueryBuilder(DETECTOR_ID_FIELD, detector.getDetectorId()));
+       query.filter(new TermQueryBuilder(DETECTOR_ID_FIELD, detector.getId()));
        query.filter(new TermQueryBuilder(IS_LATEST_FIELD, true));
        // make sure we reset all latest task as false when user switch from single entity to HC, vice versa.
        query.filter(new TermsQueryBuilder(TASK_TYPE_FIELD, taskTypeToString(getADTaskTypes(detectionDateRange, true))));
@@ -1487,34 +1475,34 @@ private void updateLatestFlagOfOldTasksAndCreateNewTask(
                String coordinatingNode = detectionDateRange == null ? null : clusterService.localNode().getId();
                createNewADTask(detector, detectionDateRange, user, coordinatingNode, listener);
            } else {
-               logger.error("Failed to update old task's state for detector: {}, response: {} ", detector.getDetectorId(), r.toString());
+               logger.error("Failed to update old task's state for detector: {}, response: {} ", detector.getId(), r.toString());
                listener.onFailure(bulkFailures.get(0).getCause());
            }
        }, e -> {
-           logger.error("Failed to reset old tasks as not latest for detector " + detector.getDetectorId(), e);
+           logger.error("Failed to reset old tasks as not latest for detector " + detector.getId(), e);
            listener.onFailure(e);
        }));
    }

    private void createNewADTask(
        AnomalyDetector detector,
-       DetectionDateRange detectionDateRange,
+       DateRange detectionDateRange,
        User user,
        String coordinatingNode,
-       ActionListener listener
+       ActionListener listener
    ) {
        String userName = user == null ? null : user.getName();
        Instant now = Instant.now();
        String taskType = getADTaskType(detector, detectionDateRange).name();
        ADTask adTask = new ADTask.Builder()
-           .detectorId(detector.getDetectorId())
+           .configId(detector.getId())
            .detector(detector)
            .isLatest(true)
            .taskType(taskType)
            .executionStartTime(now)
            .taskProgress(0.0f)
            .initProgress(0.0f)
-           .state(ADTaskState.CREATED.name())
+           .state(TaskState.CREATED.name())
            .lastUpdateTime(now)
            .startedBy(userName)
            .coordinatingNode(coordinatingNode)
@@ -1550,11 +1538,11 @@ public void createADTaskDirectly(ADTask adTask, Consumer func
                .source(adTask.toXContent(builder, RestHandlerUtils.XCONTENT_WITH_TYPE))
                .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE);
            client.index(request, ActionListener.wrap(r -> function.accept(r), e -> {
-               logger.error("Failed to create AD task for detector " + adTask.getDetectorId(), e);
+               logger.error("Failed to create AD task for detector " + adTask.getConfigId(), e);
                listener.onFailure(e);
            }));
        } catch (Exception e) {
-           logger.error("Failed to create AD task for detector " + adTask.getDetectorId(), e);
+           logger.error("Failed to create AD task for detector " + adTask.getConfigId(), e);
            listener.onFailure(e);
        }
    }
@@ -1562,8 +1550,8 @@ public void createADTaskDirectly(ADTask adTask, Consumer func
    private void onIndexADTaskResponse(
        IndexResponse response,
        ADTask adTask,
-       BiConsumer> function,
-       ActionListener listener
+       BiConsumer> function,
+       ActionListener listener
    ) {
        if (response == null || response.getResult() != CREATED) {
            String errorMsg = getShardsFailure(response);
@@ -1571,7 +1559,7 @@ private void onIndexADTaskResponse(
            return;
        }
        adTask.setTaskId(response.getId());
-       ActionListener delegatedListener = ActionListener.wrap(r -> { listener.onResponse(r); }, e -> {
+       ActionListener delegatedListener = ActionListener.wrap(r -> { listener.onResponse(r); }, e -> {
            handleADTaskException(adTask, e);
            if (e instanceof DuplicateTaskException) {
                listener.onFailure(new OpenSearchStatusException(DETECTOR_IS_RUNNING, RestStatus.BAD_REQUEST));
@@ -1581,7 +1569,7 @@ private void onIndexADTaskResponse(
                // ADTaskManager#initRealtimeTaskCacheAndCleanupStaleCache for details. Here the
                // realtime task cache not inited yet when create AD task, so no need to cleanup.
                if (adTask.isHistoricalTask()) {
-                   adTaskCacheManager.removeHistoricalTaskCache(adTask.getDetectorId());
+                   adTaskCacheManager.removeHistoricalTaskCache(adTask.getConfigId());
                }
                listener.onFailure(e);
            }
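The builder hunk above renames the task's owning-id field from detectorId to configId and swaps the AD-specific ADTaskState enum for a shared TaskState. A stand-in builder showing just those renamed pieces; everything besides the names visible in the diff is assumed:

    import java.time.Instant;

    // Illustrative stand-in for the renamed builder fields; not the plugin's real ADTask.
    class TaskDoc {
        enum TaskState { CREATED, INIT, RUNNING, STOPPED, FINISHED, FAILED }

        String configId;
        String state;
        Instant executionStartTime;

        static class Builder {
            private final TaskDoc doc = new TaskDoc();

            Builder configId(String configId) {      // was detectorId(...) before the rename
                doc.configId = configId;
                return this;
            }

            Builder state(String state) {            // TaskState replaces the old ADTaskState
                doc.state = state;
                return this;
            }

            Builder executionStartTime(Instant now) {
                doc.executionStartTime = now;
                return this;
            }

            TaskDoc build() {
                return doc;
            }
        }
    }

Usage mirrors the diff: new TaskDoc.Builder().configId("my-detector").state(TaskDoc.TaskState.CREATED.name()).executionStartTime(Instant.now()).build().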
@@ -1591,7 +1579,7 @@ private void onIndexADTaskResponse(
            // DuplicateTaskException. This is to solve race condition when user send
            // multiple start request for one historical detector.
            if (adTask.isHistoricalTask()) {
-               adTaskCacheManager.add(adTask.getDetectorId(), adTask);
+               adTaskCacheManager.add(adTask.getConfigId(), adTask);
            }
        } catch (Exception e) {
            delegatedListener.onFailure(e);
@@ -1602,9 +1590,9 @@ private void onIndexADTaskResponse(
        }
    }

-   private void cleanOldAdTaskDocs(IndexResponse response, ADTask adTask, ActionListener delegatedListener) {
+   private void cleanOldAdTaskDocs(IndexResponse response, ADTask adTask, ActionListener delegatedListener) {
        BoolQueryBuilder query = new BoolQueryBuilder();
-       query.filter(new TermQueryBuilder(DETECTOR_ID_FIELD, adTask.getDetectorId()));
+       query.filter(new TermQueryBuilder(DETECTOR_ID_FIELD, adTask.getConfigId()));
        query.filter(new TermQueryBuilder(IS_LATEST_FIELD, false));

        if (adTask.isHistoricalTask()) {
@@ -1625,7 +1613,7 @@ private void cleanOldAdTaskDocs(IndexResponse response, ADTask adTask, ActionLis
            .from(maxOldAdTaskDocsPerDetector)
            .size(MAX_OLD_AD_TASK_DOCS);
        searchRequest.source(sourceBuilder).indices(DETECTION_STATE_INDEX);
-       String detectorId = adTask.getDetectorId();
+       String detectorId = adTask.getConfigId();

        deleteTaskDocs(detectorId, searchRequest, () -> {
            if (adTask.isHistoricalTask()) {
@@ -1633,13 +1621,7 @@ private void cleanOldAdTaskDocs(IndexResponse response, ADTask adTask, ActionLis
                runBatchResultAction(response, adTask, delegatedListener);
            } else {
                // return response directly for realtime detection
-               AnomalyDetectorJobResponse anomalyDetectorJobResponse = new AnomalyDetectorJobResponse(
-                   response.getId(),
-                   response.getVersion(),
-                   response.getSeqNo(),
-                   response.getPrimaryTerm(),
-                   RestStatus.OK
-               );
+               JobResponse anomalyDetectorJobResponse = new JobResponse(response.getId());
                delegatedListener.onResponse(anomalyDetectorJobResponse);
            }
        }, delegatedListener);
@@ -1648,7 +1630,7 @@ private void cleanOldAdTaskDocs(IndexResponse response, ADTask adTask, ActionLis
    protected void deleteTaskDocs(
        String detectorId,
        SearchRequest searchRequest,
-       AnomalyDetectorFunction function,
+       ExecutorFunction function,
        ActionListener listener
    ) {
        ActionListener searchListener = ActionListener.wrap(r -> {
@@ -1660,7 +1642,7 @@ protected void deleteTaskDocs(
                try (XContentParser parser = createXContentParserFromRegistry(xContentRegistry, searchHit.getSourceRef())) {
                    ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
                    ADTask adTask = ADTask.parse(parser, searchHit.getId());
-                   logger.debug("Delete old task: {} of detector: {}", adTask.getTaskId(), adTask.getDetectorId());
+                   logger.debug("Delete old task: {} of detector: {}", adTask.getTaskId(), adTask.getConfigId());
                    bulkRequest.add(new DeleteRequest(DETECTION_STATE_INDEX).id(adTask.getTaskId()));
                } catch (Exception e) {
                    listener.onFailure(e);
@@ -1674,7 +1656,7 @@ protected void deleteTaskDocs(
                        if (!bulkItemResponse.isFailed()) {
                            logger.debug("Add detector task into cache. Task id: {}", bulkItemResponse.getId());
                            // add deleted task in cache and delete its child tasks and AD results
-                           adTaskCacheManager.addDeletedDetectorTask(bulkItemResponse.getId());
+                           adTaskCacheManager.addDeletedTask(bulkItemResponse.getId());
                        }
                    }
                }
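The hunks above (and the deleted class later in this patch) collapse the five-field AnomalyDetectorJobResponse (id, version, seqNo, primaryTerm, RestStatus) into a JobResponse built from the id alone. A rough sketch of the slimmer shape and its stream round-trip, using plain java.io streams as stand-ins for StreamInput/StreamOutput:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;

    // Stand-in showing the assumed shape of the consolidated response: only the id
    // survives; the old version/seqNo/primaryTerm/RestStatus fields are gone.
    class JobResponseSketch {
        private final String id;

        JobResponseSketch(String id) {
            this.id = id;
        }

        JobResponseSketch(DataInput in) throws IOException {
            this.id = in.readUTF();       // mirrors StreamInput#readString
        }

        void writeTo(DataOutput out) throws IOException {
            out.writeUTF(id);             // mirrors StreamOutput#writeString
        }

        String getId() {
            return id;
        }
    }

Dropping the extra fields means callers that previously passed placeholder zeros and RestStatus.OK, as the removed constructors above did, now construct the response from the id alone.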
@@ -1704,11 +1686,11 @@
     * Poll deleted detector task from cache and delete its child tasks and AD results.
     */
    public void cleanChildTasksAndADResultsOfDeletedTask() {
-       if (!adTaskCacheManager.hasDeletedDetectorTask()) {
+       if (!adTaskCacheManager.hasDeletedTask()) {
            return;
        }
        threadPool.schedule(() -> {
-           String taskId = adTaskCacheManager.pollDeletedDetectorTask();
+           String taskId = adTaskCacheManager.pollDeletedTask();
            if (taskId == null) {
                return;
            }
@@ -1727,24 +1709,18 @@ public void cleanChildTasksAndADResultsOfDeletedTask() {
        }, TimeValue.timeValueSeconds(DEFAULT_MAINTAIN_INTERVAL_IN_SECONDS), AD_BATCH_TASK_THREAD_POOL_NAME);
    }

-   private void runBatchResultAction(IndexResponse response, ADTask adTask, ActionListener listener) {
+   private void runBatchResultAction(IndexResponse response, ADTask adTask, ActionListener listener) {
        client.execute(ADBatchAnomalyResultAction.INSTANCE, new ADBatchAnomalyResultRequest(adTask), ActionListener.wrap(r -> {
            String remoteOrLocal = r.isRunTaskRemotely() ? "remote" : "local";
            logger
                .info(
                    "AD task {} of detector {} dispatched to {} node {}",
                    adTask.getTaskId(),
-                   adTask.getDetectorId(),
+                   adTask.getConfigId(),
                    remoteOrLocal,
                    r.getNodeId()
                );
-           AnomalyDetectorJobResponse anomalyDetectorJobResponse = new AnomalyDetectorJobResponse(
-               response.getId(),
-               response.getVersion(),
-               response.getSeqNo(),
-               response.getPrimaryTerm(),
-               RestStatus.OK
-           );
+           JobResponse anomalyDetectorJobResponse = new JobResponse(response.getId());
            listener.onResponse(anomalyDetectorJobResponse);
        }, e -> listener.onFailure(e)));
    }
@@ -1757,7 +1733,7 @@ private void runBatchResultAction(IndexResponse response, ADTask adTask, ActionL
     */
    public void handleADTaskException(ADTask adTask, Exception e) {
        // TODO: handle timeout exception
-       String state = ADTaskState.FAILED.name();
+       String state = TaskState.FAILED.name();
        Map updatedFields = new HashMap<>();
        if (e instanceof DuplicateTaskException) {
            // If user send multiple start detector request, we will meet race condition.
@@ -1766,22 +1742,22 @@
            logger
                .warn(
                    "There is already one running task for detector, detectorId:"
-                       + adTask.getDetectorId()
+                       + adTask.getConfigId()
                        + ". Will delete task "
                        + adTask.getTaskId()
                );
            deleteADTask(adTask.getTaskId());
            return;
        }
-       if (e instanceof ADTaskCancelledException) {
-           logger.info("AD task cancelled, taskId: {}, detectorId: {}", adTask.getTaskId(), adTask.getDetectorId());
-           state = ADTaskState.STOPPED.name();
-           String stoppedBy = ((ADTaskCancelledException) e).getCancelledBy();
+       if (e instanceof TaskCancelledException) {
+           logger.info("AD task cancelled, taskId: {}, detectorId: {}", adTask.getTaskId(), adTask.getConfigId());
+           state = TaskState.STOPPED.name();
+           String stoppedBy = ((TaskCancelledException) e).getCancelledBy();
            if (stoppedBy != null) {
                updatedFields.put(STOPPED_BY_FIELD, stoppedBy);
            }
        } else {
-           logger.error("Failed to execute AD batch task, task id: " + adTask.getTaskId() + ", detector id: " + adTask.getDetectorId(), e);
+           logger.error("Failed to execute AD batch task, task id: " + adTask.getTaskId() + ", detector id: " + adTask.getConfigId(), e);
        }
        updatedFields.put(ERROR_FIELD, getErrorMessage(e));
        updatedFields.put(STATE_FIELD, state);
@@ -1874,7 +1850,7 @@ public ADTaskCancellationState cancelLocalTaskByDetectorId(String detectorId, St
     * @param function AD function
     * @param listener action listener
     */
-   public void deleteADTasks(String detectorId, AnomalyDetectorFunction function, ActionListener listener) {
+   public void deleteADTasks(String detectorId, ExecutorFunction function, ActionListener listener) {
        DeleteByQueryRequest request = new DeleteByQueryRequest(DETECTION_STATE_INDEX);

        BoolQueryBuilder query = new BoolQueryBuilder();
@@ -1912,7 +1888,7 @@ private void deleteADResultOfDetector(String detectorId) {
            logger.debug("Successfully deleted AD results of detector " + detectorId);
        }, exception -> {
            logger.error("Failed to delete AD results of detector " + detectorId, exception);
-           adTaskCacheManager.addDeletedDetector(detectorId);
+           adTaskCacheManager.addDeletedConfig(detectorId);
        }));
    }
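handleADTaskException above maps exception types to terminal task states: a duplicate start deletes the extra task, a cancellation becomes STOPPED, and anything else becomes FAILED. The decision in isolation, with hypothetical stand-in exception classes (not the plugin's real ones):

    // Hypothetical stand-ins for the decision in handleADTaskException.
    class DuplicateTaskExceptionSketch extends RuntimeException {
    }

    class TaskCancelledExceptionSketch extends RuntimeException {
        private final String cancelledBy;

        TaskCancelledExceptionSketch(String cancelledBy) {
            this.cancelledBy = cancelledBy;
        }

        String getCancelledBy() {
            return cancelledBy;
        }
    }

    class TaskStateMapper {
        static String stateFor(Exception e) {
            if (e instanceof DuplicateTaskExceptionSketch) {
                return null;              // duplicate start: delete the extra task, no state update
            }
            if (e instanceof TaskCancelledExceptionSketch) {
                return "STOPPED";         // user cancellation is a clean stop
            }
            return "FAILED";              // any other failure marks the task FAILED
        }
    }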
@@ -1920,7 +1896,7 @@ private void deleteADResultOfDetector(String detectorId) {
     * Clean AD results of deleted detector.
     */
    public void cleanADResultOfDeletedDetector() {
-       String detectorId = adTaskCacheManager.pollDeletedDetector();
+       String detectorId = adTaskCacheManager.pollDeletedConfig();
        if (detectorId != null) {
            deleteADResultOfDetector(detectorId);
        }
@@ -1960,10 +1936,10 @@ public void updateLatestADTask(
     */
    public void stopLatestRealtimeTask(
        String detectorId,
-       ADTaskState state,
+       TaskState state,
        Exception error,
        TransportService transportService,
-       ActionListener listener
+       ActionListener listener
    ) {
        getAndExecuteOnLatestDetectorLevelTask(detectorId, REALTIME_TASK_TYPES, (adTask) -> {
            if (adTask.isPresent() && !adTask.get().isDone()) {
@@ -1972,9 +1948,9 @@ public void stopLatestRealtimeTask(
                if (error != null) {
                    updatedFields.put(ADTask.ERROR_FIELD, error.getMessage());
                }
-               AnomalyDetectorFunction function = () -> updateADTask(adTask.get().getTaskId(), updatedFields, ActionListener.wrap(r -> {
+               ExecutorFunction function = () -> updateADTask(adTask.get().getTaskId(), updatedFields, ActionListener.wrap(r -> {
                    if (error == null) {
-                       listener.onResponse(new AnomalyDetectorJobResponse(detectorId, 0, 0, 0, RestStatus.OK));
+                       listener.onResponse(new JobResponse(detectorId));
                    } else {
                        listener.onFailure(error);
                    }
@@ -2014,11 +1990,11 @@ public void updateLatestRealtimeTaskOnCoordinatingNode(
        String newState = null;
        // calculate init progress and task state with RCF total updates
        if (detectorIntervalInMinutes != null && rcfTotalUpdates != null) {
-           newState = ADTaskState.INIT.name();
+           newState = TaskState.INIT.name();
            if (rcfTotalUpdates < NUM_MIN_SAMPLES) {
                initProgress = (float) rcfTotalUpdates / NUM_MIN_SAMPLES;
            } else {
-               newState = ADTaskState.RUNNING.name();
+               newState = TaskState.RUNNING.name();
                initProgress = 1.0f;
            }
        }
@@ -2091,14 +2067,14 @@ public void initRealtimeTaskCacheAndCleanupStaleCache(
        getAndExecuteOnLatestDetectorLevelTask(detectorId, REALTIME_TASK_TYPES, (adTaskOptional) -> {
            if (!adTaskOptional.isPresent()) {
                logger.debug("Can't find realtime task for detector {}, init realtime task cache directly", detectorId);
-               AnomalyDetectorFunction function = () -> createNewADTask(
+               ExecutorFunction function = () -> createNewADTask(
                    detector,
                    null,
                    detector.getUser(),
                    clusterService.localNode().getId(),
                    ActionListener.wrap(r -> {
                        logger.info("Recreate realtime task successfully for detector {}", detectorId);
-                       adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getDetectorIntervalInMilliseconds());
+                       adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getIntervalInMilliseconds());
                        listener.onResponse(true);
                    }, e -> {
                        logger.error("Failed to recreate realtime task for detector " + detectorId, e);
@@ -2127,12 +2103,12 @@ public void initRealtimeTaskCacheAndCleanupStaleCache(
                        oldCoordinatingNode,
                        detectorId
                    );
-                   adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getDetectorIntervalInMilliseconds());
+                   adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getIntervalInMilliseconds());
                    listener.onResponse(true);
                }, listener);
            } else {
                logger.info("Init realtime task cache for detector {}", detectorId);
-               adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getDetectorIntervalInMilliseconds());
+               adTaskCacheManager.initRealtimeTaskCache(detectorId, detector.getIntervalInMilliseconds());
                listener.onResponse(true);
            }
        }, transportService, false, listener);
@@ -2142,12 +2118,12 @@ public void initRealtimeTaskCacheAndCleanupStaleCache(
        }
    }

-   private void recreateRealtimeTask(AnomalyDetectorFunction function, ActionListener listener) {
-       if (detectionIndices.doesDetectorStateIndexExist()) {
+   private void recreateRealtimeTask(ExecutorFunction function, ActionListener listener) {
+       if (detectionIndices.doesStateIndexExist()) {
            function.execute();
        } else {
            // If detection index doesn't exist, create index and execute function.
-           detectionIndices.initDetectionStateIndex(ActionListener.wrap(r -> {
+           detectionIndices.initStateIndex(ActionListener.wrap(r -> {
                if (r.isAcknowledged()) {
                    logger.info("Created {} with mappings.", DETECTION_STATE_INDEX);
                    function.execute();
@@ -2206,7 +2182,7 @@ private void entityTaskDone(
        ADTask adTask,
        Exception exception,
        TransportService transportService,
-       ActionListener listener
+       ActionListener listener
    ) {
        try {
            ADTaskAction action = getAdEntityTaskAction(adTask, exception);
@@ -2235,7 +2211,7 @@ private ADTaskAction getAdEntityTaskAction(ADTask adTask, Exception exception) {
            adTask.setError(getErrorMessage(exception));
            if (exception instanceof LimitExceededException && isRetryableError(exception.getMessage())) {
                action = ADTaskAction.PUSH_BACK_ENTITY;
-           } else if (exception instanceof ADTaskCancelledException || exception instanceof EndRunException) {
+           } else if (exception instanceof TaskCancelledException || exception instanceof EndRunException) {
                action = ADTaskAction.CANCEL;
            }
        }
@@ -2268,10 +2244,10 @@ public boolean isRetryableError(String error) {
     * @param state AD task state
     * @param listener action listener
     */
-   public void setHCDetectorTaskDone(ADTask adTask, ADTaskState state, ActionListener listener) {
-       String detectorId = adTask.getDetectorId();
+   public void setHCDetectorTaskDone(ADTask adTask, TaskState state, ActionListener listener) {
+       String detectorId = adTask.getConfigId();
        String taskId = adTask.isEntityTask() ? adTask.getParentTaskId() : adTask.getTaskId();
-       String detectorTaskId = adTask.getDetectorLevelTaskId();
+       String detectorTaskId = adTask.getConfigLevelTaskId();

        ActionListener wrappedListener = ActionListener.wrap(response -> {
            logger.info("Historical HC detector done with state: {}. Remove from cache, detector id:{}", state.name(), detectorId);
@@ -2288,11 +2264,11 @@ public void setHCDetectorTaskDone(ADTask adTask, ADTaskState state, ActionListen
        });

        long timeoutInMillis = 2000;// wait for 2 seconds to acquire updating HC detector task semaphore
-       if (state == ADTaskState.FINISHED) {
-           this.countEntityTasksByState(detectorTaskId, ImmutableList.of(ADTaskState.FINISHED), ActionListener.wrap(r -> {
-               logger.info("number of finished entity tasks: {}, for detector {}", r, adTask.getDetectorId());
+       if (state == TaskState.FINISHED) {
+           this.countEntityTasksByState(detectorTaskId, ImmutableList.of(TaskState.FINISHED), ActionListener.wrap(r -> {
+               logger.info("number of finished entity tasks: {}, for detector {}", r, adTask.getConfigId());
                // Set task as FAILED if no finished entity task; otherwise set as FINISHED
-               ADTaskState hcDetectorTaskState = r == 0 ? ADTaskState.FAILED : ADTaskState.FINISHED;
+               TaskState hcDetectorTaskState = r == 0 ? TaskState.FAILED : TaskState.FINISHED;
                // execute in AD batch task thread pool in case waiting for semaphore waste any shared OpenSearch thread pool
                threadPool.executor(AD_BATCH_TASK_THREAD_POOL_NAME).execute(() -> {
                    updateADHCDetectorTask(
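Note the rule in setHCDetectorTaskDone above: a FINISHED high-cardinality detector task is only confirmed when countEntityTasksByState reports at least one FINISHED entity task; a zero count downgrades the detector-level task to FAILED. The rule in isolation, with a stand-in enum:

    // The FINISHED-vs-FAILED rule from setHCDetectorTaskDone, isolated for clarity.
    enum TaskStateSketch { FINISHED, FAILED }

    class HcTaskStateRule {
        // finishedEntityTasks is the count of entity tasks that reached FINISHED
        // for this detector-level task (the r in the listener above).
        static TaskStateSketch detectorLevelState(long finishedEntityTasks) {
            return finishedEntityTasks == 0 ? TaskStateSketch.FAILED : TaskStateSketch.FINISHED;
        }
    }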
@@ -2322,7 +2298,7 @@ public void setHCDetectorTaskDone(ADTask adTask, ADTaskState state, ActionListen
                        ImmutableMap
                            .of(
                                STATE_FIELD,
-                               ADTaskState.FAILED.name(),// set as FAILED if fail to get finished entity tasks.
+                               TaskState.FAILED.name(),// set as FAILED if fail to get finished entity tasks.
                                TASK_PROGRESS_FIELD,
                                1.0,
                                ERROR_FIELD,
@@ -2356,7 +2332,7 @@ public void setHCDetectorTaskDone(ADTask adTask, ADTaskState state, ActionListen
        }

-       listener.onResponse(new AnomalyDetectorJobResponse(taskId, 0, 0, 0, RestStatus.OK));
+       listener.onResponse(new JobResponse(taskId));
    }

    /**
@@ -2366,7 +2342,7 @@ public void setHCDetectorTaskDone(ADTask adTask, ADTaskState state, ActionListen
     * @param taskStates task states
     * @param listener action listener
     */
-   public void countEntityTasksByState(String detectorTaskId, List taskStates, ActionListener listener) {
+   public void countEntityTasksByState(String detectorTaskId, List taskStates, ActionListener listener) {
        BoolQueryBuilder queryBuilder = new BoolQueryBuilder();
        queryBuilder.filter(new TermQueryBuilder(PARENT_TASK_ID_FIELD, detectorTaskId));
        if (taskStates != null && taskStates.size() > 0) {
@@ -2470,12 +2446,8 @@ private void updateADHCDetectorTask(
     * @param transportService transport service
     * @param listener action listener
     */
-   public void runNextEntityForHCADHistorical(
-       ADTask adTask,
-       TransportService transportService,
-       ActionListener listener
-   ) {
-       String detectorId = adTask.getDetectorId();
+   public void runNextEntityForHCADHistorical(ADTask adTask, TransportService transportService, ActionListener listener) {
+       String detectorId = adTask.getConfigId();
        int scaleDelta = scaleTaskSlots(adTask, transportService, ActionListener.wrap(r -> {
            logger.debug("Scale up task slots done for detector {}, task {}", detectorId, adTask.getTaskId());
        }, e -> { logger.error("Failed to scale up task slots for task " + adTask.getTaskId(), e); }));
@@ -2487,7 +2459,7 @@ public void runNextEntityForHCADHistorical(
                adTask.getTaskId(),
                adTaskCacheManager.getDetectorTaskSlots(detectorId)
            );
-           listener.onResponse(new AnomalyDetectorJobResponse(detectorId, 0, 0, 0, RestStatus.ACCEPTED));
+           listener.onResponse(new JobResponse(detectorId));
            return;
        }
        client.execute(ADBatchAnomalyResultAction.INSTANCE, new ADBatchAnomalyResultRequest(adTask), ActionListener.wrap(r -> {
@@ -2500,7 +2472,7 @@
                remoteOrLocal,
                r.getNodeId()
            );
-           AnomalyDetectorJobResponse anomalyDetectorJobResponse = new AnomalyDetectorJobResponse(detectorId, 0, 0, 0, RestStatus.OK);
+           JobResponse anomalyDetectorJobResponse = new JobResponse(detectorId);
            listener.onResponse(anomalyDetectorJobResponse);
        }, e -> { listener.onFailure(e); }));
    }
@@ -2515,12 +2487,8 @@ public void runNextEntityForHCADHistorical(
     * @param scaleUpListener action listener
     * @return task slots scale delta
     */
-   protected int scaleTaskSlots(
-       ADTask adTask,
-       TransportService transportService,
-       ActionListener scaleUpListener
-   ) {
-       String detectorId = adTask.getDetectorId();
+   protected int scaleTaskSlots(ADTask adTask, TransportService transportService, ActionListener scaleUpListener) {
+       String detectorId = adTask.getConfigId();
        if (!scaleEntityTaskLane.tryAcquire()) {
            logger.debug("Can't get scaleEntityTaskLane semaphore");
            return 0;
        }
@@ -2759,22 +2727,22 @@ public synchronized void removeStaleRunningEntity(
        ADTask adTask,
        String entity,
        TransportService transportService,
-       ActionListener listener
+       ActionListener listener
    ) {
-       String detectorId = adTask.getDetectorId();
+       String detectorId = adTask.getConfigId();
        boolean removed = adTaskCacheManager.removeRunningEntity(detectorId, entity);
        if (removed && adTaskCacheManager.getPendingEntityCount(detectorId) > 0) {
            logger.debug("kick off next pending entities");
            this.runNextEntityForHCADHistorical(adTask, transportService, listener);
        } else {
            if (!adTaskCacheManager.hasEntity(detectorId)) {
-               setHCDetectorTaskDone(adTask, ADTaskState.STOPPED, listener);
+               setHCDetectorTaskDone(adTask, TaskState.STOPPED, listener);
            }
        }
    }

    public boolean skipUpdateHCRealtimeTask(String detectorId, String error) {
-       ADRealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
+       RealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
        return realtimeTaskCache != null
            && realtimeTaskCache.getInitProgress() != null
            && realtimeTaskCache.getInitProgress().floatValue() == 1.0
@@ -2782,7 +2750,7 @@ public boolean skipUpdateHCRealtimeTask(String detectorId, String error) {
    }

    public boolean isHCRealtimeTaskStartInitializing(String detectorId) {
-       ADRealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
+       RealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
        return realtimeTaskCache != null
            && realtimeTaskCache.getInitProgress() != null
            && realtimeTaskCache.getInitProgress().floatValue() > 0;
@@ -2803,18 +2771,18 @@ public String convertEntityToString(ADTask adTask) {
     * @return entity string value
     */
    public String convertEntityToString(Entity entity, AnomalyDetector detector) {
-       if (detector.isMultiCategoryDetector()) {
+       if (detector.hasMultipleCategories()) {
            try {
                XContentBuilder builder = entity.toXContent(XContentFactory.jsonBuilder(), ToXContent.EMPTY_PARAMS);
                return BytesReference.bytes(builder).utf8ToString();
            } catch (IOException e) {
                String error = "Failed to parse entity into string";
                logger.debug(error, e);
-               throw new AnomalyDetectionException(error);
+               throw new TimeSeriesException(error);
            }
        }
-       if (detector.isMultientityDetector()) {
-           String categoryField = detector.getCategoryField().get(0);
+       if (detector.isHighCardinality()) {
+           String categoryField = detector.getCategoryFields().get(0);
            return entity.getAttributes().get(categoryField);
        }
        return null;
    }
@@ -2828,7 +2796,7 @@ public String convertEntityToString(Entity entity, AnomalyDetector detector) {
     */
    public Entity parseEntityFromString(String entityValue, ADTask adTask) {
        AnomalyDetector detector = adTask.getDetector();
-       if (detector.isMultiCategoryDetector()) {
+       if (detector.hasMultipleCategories()) {
            try {
                XContentParser parser = XContentType.JSON
                    .xContent()
@@ -2838,10 +2806,10 @@
            } catch (IOException e) {
                String error = "Failed to parse string into entity";
                logger.debug(error, e);
-               throw new AnomalyDetectionException(error);
+               throw new TimeSeriesException(error);
            }
-       } else if (detector.isMultientityDetector()) {
-           return Entity.createSingleAttributeEntity(detector.getCategoryField().get(0), entityValue);
+       } else if (detector.isHighCardinality()) {
+           return Entity.createSingleAttributeEntity(detector.getCategoryFields().get(0), entityValue);
        }
        throw new IllegalArgumentException("Fail to parse to Entity for single flow detector");
    }
@@ -3018,7 +2986,7 @@ private void maintainRunningHistoricalTask(ConcurrentLinkedQueue taskQue
            logger.debug("Finished maintaining running historical task {}", adTask.getTaskId());
            maintainRunningHistoricalTask(taskQueue, transportService);
        }, transportService, ActionListener.wrap(r -> {
-           logger.debug("Reset historical task state done for task {}, detector {}", adTask.getTaskId(), adTask.getDetectorId());
+           logger.debug("Reset historical task state done for task {}, detector {}", adTask.getTaskId(), adTask.getConfigId());
        }, e -> { logger.error("Failed to reset historical task state for task " + adTask.getTaskId(), e); }));
    }, TimeValue.timeValueSeconds(DEFAULT_MAINTAIN_INTERVAL_IN_SECONDS), AD_BATCH_TASK_THREAD_POOL_NAME);
}
@@ -3034,7 +3002,7 @@ public void maintainRunningRealtimeTasks() {
        }
        for (int i = 0; i < detectorIds.length; i++) {
            String detectorId = detectorIds[i];
-           ADRealtimeTaskCache taskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
+           RealtimeTaskCache taskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId);
            if (taskCache != null && taskCache.expired()) {
                adTaskCacheManager.removeRealtimeTaskCache(detectorId);
            }
        }
diff --git a/src/main/java/org/opensearch/ad/transport/ADBatchAnomalyResultAction.java b/src/main/java/org/opensearch/ad/transport/ADBatchAnomalyResultAction.java
index 91d9afa3d..84fe0c6fe 100644
--- a/src/main/java/org/opensearch/ad/transport/ADBatchAnomalyResultAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADBatchAnomalyResultAction.java
@@ -11,7 +11,7 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonName.AD_TASK;
+import static org.opensearch.ad.constant.ADCommonName.AD_TASK;

 import org.opensearch.action.ActionType;
 import org.opensearch.ad.constant.CommonValue;
diff --git a/src/main/java/org/opensearch/ad/transport/ADBatchTaskRemoteExecutionAction.java b/src/main/java/org/opensearch/ad/transport/ADBatchTaskRemoteExecutionAction.java
index 5eb0f8f5c..d865ec14c 100644
--- a/src/main/java/org/opensearch/ad/transport/ADBatchTaskRemoteExecutionAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADBatchTaskRemoteExecutionAction.java
@@ -11,7 +11,7 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonName.AD_TASK_REMOTE;
+import static org.opensearch.ad.constant.ADCommonName.AD_TASK_REMOTE;

 import org.opensearch.action.ActionType;
 import org.opensearch.ad.constant.CommonValue;
diff --git a/src/main/java/org/opensearch/ad/transport/ADCancelTaskAction.java b/src/main/java/org/opensearch/ad/transport/ADCancelTaskAction.java
index f328caf1d..31f20fa00 100644
--- a/src/main/java/org/opensearch/ad/transport/ADCancelTaskAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADCancelTaskAction.java
@@ -11,7 +11,7 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonName.CANCEL_TASK;
+import static org.opensearch.ad.constant.ADCommonName.CANCEL_TASK;

 import org.opensearch.action.ActionType;
 import org.opensearch.ad.constant.CommonValue;
diff --git a/src/main/java/org/opensearch/ad/transport/ADCancelTaskNodeRequest.java b/src/main/java/org/opensearch/ad/transport/ADCancelTaskNodeRequest.java
index df8829fe5..d26523e82 100644
--- a/src/main/java/org/opensearch/ad/transport/ADCancelTaskNodeRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ADCancelTaskNodeRequest.java
@@ -34,7 +34,7 @@ public ADCancelTaskNodeRequest(StreamInput in) throws IOException {
    }

    public ADCancelTaskNodeRequest(ADCancelTaskRequest request) {
-       this.detectorId = request.getDetectorId();
+       this.detectorId = request.getId();
        this.detectorTaskId = request.getDetectorTaskId();
        this.userName = request.getUserName();
        this.reason = request.getReason();
@@ -49,7 +49,7 @@ public void writeTo(StreamOutput out) throws IOException {
        out.writeOptionalString(reason);
    }

-   public String getDetectorId() {
+   public String getId() {
        return detectorId;
    }
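ADCancelTaskNodeRequest above shows the accessor rename pattern repeated across the transport layer (ADCancelTaskRequest and ADTaskProfileRequest below follow the same shape): the private detectorId field stays, but the public getter becomes the config-agnostic getId(). A minimal stand-in:

    // Stand-in illustrating the accessor rename: the field keeps its old name,
    // only the public getter becomes the config-agnostic getId().
    class CancelTaskRequestSketch {
        private final String detectorId;

        CancelTaskRequestSketch(String detectorId) {
            this.detectorId = detectorId;
        }

        public String getId() {   // was getDetectorId()
            return detectorId;
        }
    }

Keeping the field name while generalizing the getter lets the shared timeseries code address detectors and forecasters uniformly without touching the wire format.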
diff --git a/src/main/java/org/opensearch/ad/transport/ADCancelTaskRequest.java b/src/main/java/org/opensearch/ad/transport/ADCancelTaskRequest.java
index b75e7ea54..ddfbd6a53 100644
--- a/src/main/java/org/opensearch/ad/transport/ADCancelTaskRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ADCancelTaskRequest.java
@@ -17,7 +17,7 @@

 import org.opensearch.action.ActionRequestValidationException;
 import org.opensearch.action.support.nodes.BaseNodesRequest;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.common.io.stream.StreamInput;
@@ -56,7 +56,7 @@ public ADCancelTaskRequest(String detectorId, String detectorTaskId, String user
    public ActionRequestValidationException validate() {
        ActionRequestValidationException validationException = null;
        if (Strings.isEmpty(detectorId)) {
-           validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException);
+           validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException);
        }
        return validationException;
    }
@@ -70,7 +70,7 @@ public void writeTo(StreamOutput out) throws IOException {
        out.writeOptionalString(reason);
    }

-   public String getDetectorId() {
+   public String getId() {
        return detectorId;
    }
diff --git a/src/main/java/org/opensearch/ad/transport/ADCancelTaskTransportAction.java b/src/main/java/org/opensearch/ad/transport/ADCancelTaskTransportAction.java
index 696291558..03d0a5861 100644
--- a/src/main/java/org/opensearch/ad/transport/ADCancelTaskTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADCancelTaskTransportAction.java
@@ -11,7 +11,7 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.HISTORICAL_ANALYSIS_CANCELLED;
+import static org.opensearch.ad.constant.ADCommonMessages.HISTORICAL_ANALYSIS_CANCELLED;

 import java.io.IOException;
 import java.util.List;
@@ -79,11 +79,11 @@ protected ADCancelTaskNodeResponse newNodeResponse(StreamInput in) throws IOExce
    @Override
    protected ADCancelTaskNodeResponse nodeOperation(ADCancelTaskNodeRequest request) {
        String userName = request.getUserName();
-       String detectorId = request.getDetectorId();
+       String detectorId = request.getId();
        String detectorTaskId = request.getDetectorTaskId();
        String reason = Optional.ofNullable(request.getReason()).orElse(HISTORICAL_ANALYSIS_CANCELLED);
        ADTaskCancellationState state = adTaskManager.cancelLocalTaskByDetectorId(detectorId, detectorTaskId, reason, userName);
-       logger.debug("Cancelled AD task for detector: {}", request.getDetectorId());
+       logger.debug("Cancelled AD task for detector: {}", request.getId());
        return new ADCancelTaskNodeResponse(clusterService.localNode(), state);
    }
}
diff --git a/src/main/java/org/opensearch/ad/transport/ADResultBulkTransportAction.java b/src/main/java/org/opensearch/ad/transport/ADResultBulkTransportAction.java
index ac9ca269c..03ca7657c 100644
--- a/src/main/java/org/opensearch/ad/transport/ADResultBulkTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADResultBulkTransportAction.java
@@ -11,8 +11,8 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.settings.AnomalyDetectorSettings.INDEX_PRESSURE_HARD_LIMIT;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_INDEX_PRESSURE_HARD_LIMIT;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT;
 import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder;
 import static org.opensearch.index.IndexingPressure.MAX_INDEXING_BYTES;
@@ -27,11 +27,10 @@
 import org.opensearch.action.index.IndexRequest;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.ad.ratelimit.ResultWriteRequest;
 import org.opensearch.ad.util.BulkUtil;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.inject.Inject;
@@ -41,6 +40,7 @@
 import org.opensearch.index.IndexingPressure;
 import org.opensearch.tasks.Task;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 import org.opensearch.transport.TransportService;

 public class ADResultBulkTransportAction extends HandledTransportAction {
@@ -66,12 +66,12 @@ public ADResultBulkTransportAction(
        super(ADResultBulkAction.NAME, transportService, actionFilters, ADResultBulkRequest::new, ThreadPool.Names.SAME);
        this.indexingPressure = indexingPressure;
        this.primaryAndCoordinatingLimits = MAX_INDEXING_BYTES.get(settings).getBytes();
-       this.softLimit = INDEX_PRESSURE_SOFT_LIMIT.get(settings);
-       this.hardLimit = INDEX_PRESSURE_HARD_LIMIT.get(settings);
-       this.indexName = CommonName.ANOMALY_RESULT_INDEX_ALIAS;
+       this.softLimit = AD_INDEX_PRESSURE_SOFT_LIMIT.get(settings);
+       this.hardLimit = AD_INDEX_PRESSURE_HARD_LIMIT.get(settings);
+       this.indexName = ADCommonName.ANOMALY_RESULT_INDEX_ALIAS;
        this.client = client;
-       clusterService.getClusterSettings().addSettingsUpdateConsumer(INDEX_PRESSURE_SOFT_LIMIT, it -> softLimit = it);
-       clusterService.getClusterSettings().addSettingsUpdateConsumer(INDEX_PRESSURE_HARD_LIMIT, it -> hardLimit = it);
+       clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_INDEX_PRESSURE_SOFT_LIMIT, it -> softLimit = it);
+       clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_INDEX_PRESSURE_HARD_LIMIT, it -> hardLimit = it);
        // random seed is 42. Can be any number
        this.random = new Random(42);
    }
@@ -94,7 +94,7 @@ protected void doExecute(Task task, ADResultBulkRequest request, ActionListener<
        if (indexingPressurePercent <= softLimit) {
            for (ResultWriteRequest resultWriteRequest : results) {
-               addResult(bulkRequest, resultWriteRequest.getResult(), resultWriteRequest.getResultIndex());
+               addResult(bulkRequest, resultWriteRequest.getResult(), resultWriteRequest.getCustomResultIndex());
            }
        } else if (indexingPressurePercent <= hardLimit) {
            // exceed soft limit (60%) but smaller than hard limit (90%)
@@ -102,7 +102,7 @@ protected void doExecute(Task task, ADResultBulkRequest request, ActionListener<
            for (ResultWriteRequest resultWriteRequest : results) {
                AnomalyResult result = resultWriteRequest.getResult();
                if (result.isHighPriority() || random.nextFloat() < acceptProbability) {
-                   addResult(bulkRequest, result, resultWriteRequest.getResultIndex());
+                   addResult(bulkRequest, result, resultWriteRequest.getCustomResultIndex());
                }
            }
        } else {
@@ -110,7 +110,7 @@ protected void doExecute(Task task, ADResultBulkRequest request, ActionListener<
            for (ResultWriteRequest resultWriteRequest : results) {
                AnomalyResult result = resultWriteRequest.getResult();
                if (result.isHighPriority()) {
-                   addResult(bulkRequest, result, resultWriteRequest.getResultIndex());
+                   addResult(bulkRequest, result, resultWriteRequest.getCustomResultIndex());
                }
            }
        }
diff --git a/src/main/java/org/opensearch/ad/transport/ADTaskProfileAction.java b/src/main/java/org/opensearch/ad/transport/ADTaskProfileAction.java
index 2781cf720..f2b198d1c 100644
--- a/src/main/java/org/opensearch/ad/transport/ADTaskProfileAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADTaskProfileAction.java
@@ -11,7 +11,7 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonName.AD_TASK;
+import static org.opensearch.ad.constant.ADCommonName.AD_TASK;

 import org.opensearch.action.ActionType;
 import org.opensearch.ad.constant.CommonValue;
diff --git a/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeRequest.java b/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeRequest.java
index e4b09f37f..8391dfe67 100644
--- a/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeRequest.java
@@ -26,7 +26,7 @@ public ADTaskProfileNodeRequest(StreamInput in) throws IOException {
    }

    public ADTaskProfileNodeRequest(ADTaskProfileRequest request) {
-       this.detectorId = request.getDetectorId();
+       this.detectorId = request.getId();
    }

    @Override
@@ -35,7 +35,7 @@ public void writeTo(StreamOutput out) throws IOException {
        out.writeString(detectorId);
    }

-   public String getDetectorId() {
+   public String getId() {
        return detectorId;
    }
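ADResultBulkTransportAction above sheds load based on indexing pressure: below the soft limit everything is indexed, between the soft (60%) and hard (90%) limits low-priority results are randomly sampled, and above the hard limit only high-priority results survive. The branching in isolation; the thresholds and the fixed seed follow the diff, everything else is an assumed sketch:

    import java.util.Random;

    // The admission rule from ADResultBulkTransportAction#doExecute, isolated.
    class ResultAdmission {
        private final float softLimit;      // e.g. 0.6f per the "60%" comment in the diff
        private final float hardLimit;      // e.g. 0.9f per the "90%" comment in the diff
        private final Random random = new Random(42);   // same fixed seed as the action

        ResultAdmission(float softLimit, float hardLimit) {
            this.softLimit = softLimit;
            this.hardLimit = hardLimit;
        }

        boolean accept(float indexingPressurePercent, boolean highPriority, float acceptProbability) {
            if (indexingPressurePercent <= softLimit) {
                return true;                // below soft limit: index everything
            } else if (indexingPressurePercent <= hardLimit) {
                // between limits: always keep high priority, sample the rest
                return highPriority || random.nextFloat() < acceptProbability;
            }
            return highPriority;            // above hard limit: high priority only
        }
    }

Probabilistic shedding keeps some low-priority signal flowing under moderate pressure instead of cutting it off at a single threshold.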
diff --git a/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeResponse.java b/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeResponse.java
index 59cdf9748..363e70be2 100644
--- a/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/ADTaskProfileNodeResponse.java
@@ -17,7 +17,6 @@

 import org.apache.logging.log4j.Logger;
 import org.opensearch.Version;
 import org.opensearch.action.support.nodes.BaseNodeResponse;
-import org.opensearch.ad.cluster.ADVersionUtil;
 import org.opensearch.ad.model.ADTaskProfile;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.core.common.io.stream.StreamInput;
@@ -50,8 +49,7 @@ public ADTaskProfile getAdTaskProfile() {
    @Override
    public void writeTo(StreamOutput out) throws IOException {
        super.writeTo(out);
-       if (adTaskProfile != null
-           && (ADVersionUtil.compatibleWithVersionOnOrAfter1_1(remoteAdVersion) || adTaskProfile.getNodeId() != null)) {
+       if (adTaskProfile != null && (remoteAdVersion != null || adTaskProfile.getNodeId() != null)) {
            out.writeBoolean(true);
            adTaskProfile.writeTo(out, remoteAdVersion);
        } else {
diff --git a/src/main/java/org/opensearch/ad/transport/ADTaskProfileRequest.java b/src/main/java/org/opensearch/ad/transport/ADTaskProfileRequest.java
index 8ca4e6ff5..7b078d215 100644
--- a/src/main/java/org/opensearch/ad/transport/ADTaskProfileRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ADTaskProfileRequest.java
@@ -17,7 +17,7 @@

 import org.opensearch.action.ActionRequestValidationException;
 import org.opensearch.action.support.nodes.BaseNodesRequest;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.common.io.stream.StreamInput;
@@ -41,7 +41,7 @@ public ADTaskProfileRequest(String detectorId, DiscoveryNode... nodes) {
    public ActionRequestValidationException validate() {
        ActionRequestValidationException validationException = null;
        if (Strings.isEmpty(detectorId)) {
-           validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException);
+           validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException);
        }
        return validationException;
    }
@@ -52,7 +52,7 @@ public void writeTo(StreamOutput out) throws IOException {
        out.writeString(detectorId);
    }

-   public String getDetectorId() {
+   public String getId() {
        return detectorId;
    }
}
diff --git a/src/main/java/org/opensearch/ad/transport/ADTaskProfileTransportAction.java b/src/main/java/org/opensearch/ad/transport/ADTaskProfileTransportAction.java
index 58407e455..6902d6de8 100644
--- a/src/main/java/org/opensearch/ad/transport/ADTaskProfileTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ADTaskProfileTransportAction.java
@@ -80,7 +80,7 @@ protected ADTaskProfileNodeResponse newNodeResponse(StreamInput in) throws IOExc
    protected ADTaskProfileNodeResponse nodeOperation(ADTaskProfileNodeRequest request) {
        String remoteNodeId = request.getParentTask().getNodeId();
        Version remoteAdVersion = hashRing.getAdVersion(remoteNodeId);
-       ADTaskProfile adTaskProfile = adTaskManager.getLocalADTaskProfilesByDetectorId(request.getDetectorId());
+       ADTaskProfile adTaskProfile = adTaskManager.getLocalADTaskProfilesByDetectorId(request.getId());
        return new ADTaskProfileNodeResponse(clusterService.localNode(), adTaskProfile, remoteAdVersion);
    }
}
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobAction.java b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobAction.java
index b11283181..83ea58960 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobAction.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobAction.java
@@ -13,14 +13,15 @@

 import org.opensearch.action.ActionType;
 import org.opensearch.ad.constant.CommonValue;
+import org.opensearch.timeseries.transport.JobResponse;

-public class AnomalyDetectorJobAction extends ActionType {
+public class AnomalyDetectorJobAction extends ActionType {
    // External Action which used for public facing RestAPIs.
    public static final String NAME = CommonValue.EXTERNAL_ACTION_PREFIX + "detector/jobmanagement";
    public static final AnomalyDetectorJobAction INSTANCE = new AnomalyDetectorJobAction();

    private AnomalyDetectorJobAction() {
-       super(NAME, AnomalyDetectorJobResponse::new);
+       super(NAME, JobResponse::new);
    }
}
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobRequest.java b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobRequest.java
index 5656b2f27..3a62315a6 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobRequest.java
@@ -15,14 +15,14 @@

 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.model.DetectionDateRange;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
+import org.opensearch.timeseries.model.DateRange;

 public class AnomalyDetectorJobRequest extends ActionRequest {

    private String detectorID;
-   private DetectionDateRange detectionDateRange;
+   private DateRange detectionDateRange;
    private boolean historical;
    private long seqNo;
    private long primaryTerm;
@@ -35,7 +35,7 @@ public AnomalyDetectorJobRequest(StreamInput in) throws IOException {
        primaryTerm = in.readLong();
        rawPath = in.readString();
        if (in.readBoolean()) {
-           detectionDateRange = new DetectionDateRange(in);
+           detectionDateRange = new DateRange(in);
        }
        historical = in.readBoolean();
    }
@@ -61,7 +61,7 @@ public AnomalyDetectorJobRequest(String detectorID, long seqNo, long primaryTerm
     */
    public AnomalyDetectorJobRequest(
        String detectorID,
-       DetectionDateRange detectionDateRange,
+       DateRange detectionDateRange,
        boolean historical,
        long seqNo,
        long primaryTerm,
@@ -80,7 +80,7 @@ public String getDetectorID() {
        return detectorID;
    }

-   public DetectionDateRange getDetectionDateRange() {
+   public DateRange getDetectionDateRange() {
        return detectionDateRange;
    }
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobResponse.java b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobResponse.java
deleted file mode 100644
index 508762cf1..000000000
--- a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobResponse.java
+++ /dev/null
@@ -1,71 +0,0 @@
-/*
- * SPDX-License-Identifier: Apache-2.0
- *
- * The OpenSearch Contributors require contributions made to
- * this file be licensed under the Apache-2.0 license or a
- * compatible open source license.
- *
- * Modifications Copyright OpenSearch Contributors. See
- * GitHub history for details.
- */
-
-package org.opensearch.ad.transport;
-
-import java.io.IOException;
-
-import org.opensearch.ad.util.RestHandlerUtils;
-import org.opensearch.core.action.ActionResponse;
-import org.opensearch.core.common.io.stream.StreamInput;
-import org.opensearch.core.common.io.stream.StreamOutput;
-import org.opensearch.core.rest.RestStatus;
-import org.opensearch.core.xcontent.ToXContentObject;
-import org.opensearch.core.xcontent.XContentBuilder;
-
-public class AnomalyDetectorJobResponse extends ActionResponse implements ToXContentObject {
-    private final String id;
-    private final long version;
-    private final long seqNo;
-    private final long primaryTerm;
-    private final RestStatus restStatus;
-
-    public AnomalyDetectorJobResponse(StreamInput in) throws IOException {
-        super(in);
-        id = in.readString();
-        version = in.readLong();
-        seqNo = in.readLong();
-        primaryTerm = in.readLong();
-        restStatus = in.readEnum(RestStatus.class);
-    }
-
-    public AnomalyDetectorJobResponse(String id, long version, long seqNo, long primaryTerm, RestStatus restStatus) {
-        this.id = id;
-        this.version = version;
-        this.seqNo = seqNo;
-        this.primaryTerm = primaryTerm;
-        this.restStatus = restStatus;
-    }
-
-    public String getId() {
-        return id;
-    }
-
-    @Override
-    public void writeTo(StreamOutput out) throws IOException {
-        out.writeString(id);
-        out.writeLong(version);
-        out.writeLong(seqNo);
-        out.writeLong(primaryTerm);
-        out.writeEnum(restStatus);
-    }
-
-    @Override
-    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
-        return builder
-            .startObject()
-            .field(RestHandlerUtils._ID, id)
-            .field(RestHandlerUtils._VERSION, version)
-            .field(RestHandlerUtils._SEQ_NO, seqNo)
-            .field(RestHandlerUtils._PRIMARY_TERM, primaryTerm)
-            .endObject();
-    }
-}
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportAction.java b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportAction.java
index fe2307437..2ffb8b85a 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportAction.java
@@ -11,24 +11,23 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_START_DETECTOR;
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_STOP_DETECTOR;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT;
-import static org.opensearch.ad.util.ParseUtils.getUserContext;
-import static org.opensearch.ad.util.ParseUtils.resolveUserAndExecute;
-import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_START_DETECTOR;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_STOP_DETECTOR;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT;
+import static org.opensearch.timeseries.util.ParseUtils.getUserContext;
+import static org.opensearch.timeseries.util.ParseUtils.resolveUserAndExecute;
+import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener;

 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
 import org.opensearch.ad.ExecuteADResultResponseRecorder;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
-import org.opensearch.ad.model.DetectionDateRange;
+import org.opensearch.ad.indices.ADIndexManagement;
+import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.rest.handler.IndexAnomalyDetectorJobActionHandler;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.inject.Inject;
@@ -39,15 +38,18 @@
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.xcontent.NamedXContentRegistry;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.model.DateRange;
+import org.opensearch.timeseries.transport.JobResponse;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 import org.opensearch.transport.TransportService;

-public class AnomalyDetectorJobTransportAction extends HandledTransportAction {
+public class AnomalyDetectorJobTransportAction extends HandledTransportAction {
    private final Logger logger = LogManager.getLogger(AnomalyDetectorJobTransportAction.class);

    private final Client client;
    private final ClusterService clusterService;
    private final Settings settings;
-   private final AnomalyDetectionIndices anomalyDetectionIndices;
+   private final ADIndexManagement anomalyDetectionIndices;
    private final NamedXContentRegistry xContentRegistry;
    private volatile Boolean filterByEnabled;
    private final ADTaskManager adTaskManager;
@@ -61,7 +63,7 @@ public AnomalyDetectorJobTransportAction(
        Client client,
        ClusterService clusterService,
        Settings settings,
-       AnomalyDetectionIndices anomalyDetectionIndices,
+       ADIndexManagement anomalyDetectionIndices,
        NamedXContentRegistry xContentRegistry,
        ADTaskManager adTaskManager,
        ExecuteADResultResponseRecorder recorder
@@ -74,22 +76,22 @@ public AnomalyDetectorJobTransportAction(
        this.anomalyDetectionIndices = anomalyDetectionIndices;
        this.xContentRegistry = xContentRegistry;
        this.adTaskManager = adTaskManager;
-       filterByEnabled = FILTER_BY_BACKEND_ROLES.get(settings);
-       clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
+       filterByEnabled = AD_FILTER_BY_BACKEND_ROLES.get(settings);
+       clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
        this.recorder = recorder;
    }

    @Override
-   protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionListener actionListener) {
+   protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionListener actionListener) {
        String detectorId = request.getDetectorID();
-       DetectionDateRange detectionDateRange = request.getDetectionDateRange();
+       DateRange detectionDateRange = request.getDetectionDateRange();
        boolean historical = request.isHistorical();
        long seqNo = request.getSeqNo();
        long primaryTerm = request.getPrimaryTerm();
        String rawPath = request.getRawPath();
-       TimeValue requestTimeout = REQUEST_TIMEOUT.get(settings);
+       TimeValue requestTimeout = AD_REQUEST_TIMEOUT.get(settings);
        String errorMessage = rawPath.endsWith(RestHandlerUtils.START_JOB) ? FAIL_TO_START_DETECTOR : FAIL_TO_STOP_DETECTOR;
-       ActionListener listener = wrapRestActionListener(actionListener, errorMessage);
+       ActionListener listener = wrapRestActionListener(actionListener, errorMessage);

        // By the time request reaches here, the user permissions are validated by Security plugin.
        User user = getUserContext(client);
@@ -113,7 +115,8 @@ protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionLis
                ),
                client,
                clusterService,
-               xContentRegistry
+               xContentRegistry,
+               AnomalyDetector.class
            );
        } catch (Exception e) {
            logger.error(e);
@@ -122,9 +125,9 @@ protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionLis
    }

    private void executeDetector(
-       ActionListener listener,
+       ActionListener listener,
        String detectorId,
-       DetectionDateRange detectionDateRange,
+       DateRange detectionDateRange,
        boolean historical,
        long seqNo,
        long primaryTerm,
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyResultRequest.java b/src/main/java/org/opensearch/ad/transport/AnomalyResultRequest.java
index 785b6fdb6..e6f788aeb 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyResultRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyResultRequest.java
@@ -20,8 +20,8 @@

 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.common.io.stream.InputStreamStreamInput;
 import org.opensearch.core.common.io.stream.OutputStreamStreamOutput;
@@ -29,6 +29,8 @@
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;

 public class AnomalyResultRequest extends ActionRequest implements ToXContentObject {
    private String adID;
@@ -74,11 +76,11 @@ public void writeTo(StreamOutput out) throws IOException {
    public ActionRequestValidationException validate() {
        ActionRequestValidationException validationException = null;
        if (Strings.isEmpty(adID)) {
-           validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException);
+           validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException);
        }
        if (start <= 0 || end <= 0 || start > end) {
            validationException = addValidationError(
-               String.format(Locale.ROOT, "%s: start %d, end %d", CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG, start, end),
+               String.format(Locale.ROOT, "%s: start %d, end %d", CommonMessages.INVALID_TIMESTAMP_ERR_MSG, start, end),
                validationException
            );
        }
@@ -88,7 +90,7 @@ public ActionRequestValidationException validate() {
    @Override
    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
        builder.startObject();
-       builder.field(CommonName.ID_JSON_KEY, adID);
+       builder.field(ADCommonName.ID_JSON_KEY, adID);
        builder.field(CommonName.START_JSON_KEY, start);
        builder.field(CommonName.END_JSON_KEY, end);
        builder.endObject();
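AnomalyResultRequest#validate above accumulates validation errors rather than failing fast: a missing id and a non-positive or inverted time range are both reported. The same checks in plain Java, with stand-in message constants in place of ADCommonMessages/CommonMessages:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;

    // Plain-Java rendering of the checks in AnomalyResultRequest#validate.
    class ResultRequestValidator {
        static final String AD_ID_MISSING_MSG = "AD ID is missing";               // stand-in constant
        static final String INVALID_TIMESTAMP_ERR_MSG = "timestamp is invalid";   // stand-in constant

        static List<String> validate(String adID, long start, long end) {
            List<String> errors = new ArrayList<>();
            if (adID == null || adID.isEmpty()) {
                errors.add(AD_ID_MISSING_MSG);
            }
            if (start <= 0 || end <= 0 || start > end) {
                errors.add(String.format(Locale.ROOT, "%s: start %d, end %d", INVALID_TIMESTAMP_ERR_MSG, start, end));
            }
            return errors;
        }
    }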
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyResultResponse.java b/src/main/java/org/opensearch/ad/transport/AnomalyResultResponse.java
index 4b844ac27..67113d3af 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyResultResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyResultResponse.java
@@ -18,9 +18,9 @@

 import java.time.Instant;
 import java.util.ArrayList;
 import java.util.List;
+import java.util.Optional;

 import org.opensearch.ad.model.AnomalyResult;
-import org.opensearch.ad.model.FeatureData;
 import org.opensearch.commons.authuser.User;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.InputStreamStreamInput;
@@ -29,6 +29,7 @@
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.model.FeatureData;

 public class AnomalyResultResponse extends ActionResponse implements ToXContentObject {
    public static final String ANOMALY_GRADE_JSON_KEY = "anomalyGrade";
@@ -196,7 +197,7 @@ public Long getRcfTotalUpdates() {
        return rcfTotalUpdates;
    }

-   public Long getDetectorIntervalInMinutes() {
+   public Long getIntervalInMinutes() {
        return detectorIntervalInMinutes;
    }
@@ -360,7 +361,7 @@ public AnomalyResult toAnomalyResult(
            executionStartInstant,
            executionEndInstant,
            error,
-           null,
+           Optional.empty(),
            user,
            schemaVersion,
            null, // single-stream real-time has no model id
diff --git a/src/main/java/org/opensearch/ad/transport/AnomalyResultTransportAction.java b/src/main/java/org/opensearch/ad/transport/AnomalyResultTransportAction.java
index ef030d219..084db7f42 100644
--- a/src/main/java/org/opensearch/ad/transport/AnomalyResultTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/AnomalyResultTransportAction.java
@@ -11,9 +11,9 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_SEARCH_QUERY_MSG;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.PAGE_SIZE;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_PAGE_SIZE;
+import static org.opensearch.timeseries.constant.CommonMessages.INVALID_SEARCH_QUERY_MSG;

 import java.net.ConnectException;
 import java.util.ArrayList;
@@ -42,37 +42,19 @@
 import org.opensearch.action.support.IndicesOptions;
 import org.opensearch.action.support.ThreadedActionListener;
 import org.opensearch.action.support.master.AcknowledgedResponse;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
 import org.opensearch.ad.cluster.HashRing;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.ClientException;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.common.exception.InternalFailure;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.common.exception.NotSerializedADExceptionName;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.feature.CompositeRetriever;
 import org.opensearch.ad.feature.CompositeRetriever.PageIterator;
 import org.opensearch.ad.feature.FeatureManager;
 import org.opensearch.ad.feature.SinglePointFeatures;
 import org.opensearch.ad.ml.ModelManager;
-import org.opensearch.ad.ml.SingleStreamModelIdMapper;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.FeatureData;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.settings.EnabledSetting;
 import org.opensearch.ad.stats.ADStats;
-import org.opensearch.ad.stats.StatNames;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.ExceptionUtil;
-import org.opensearch.ad.util.ParseUtils;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterState;
 import org.opensearch.cluster.block.ClusterBlockLevel;
@@ -93,6 +75,28 @@
 import org.opensearch.node.NodeClosedException;
 import org.opensearch.tasks.Task;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.ClientException;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.common.exception.InternalFailure;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.common.exception.NotSerializedExceptionName;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
+import org.opensearch.timeseries.model.Config;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.FeatureData;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.stats.StatNames;
+import org.opensearch.timeseries.util.ExceptionUtil;
+import org.opensearch.timeseries.util.ParseUtils;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.ActionNotFoundTransportException;
 import org.opensearch.transport.ConnectTransportException;
 import org.opensearch.transport.NodeNotConnectedException;
@@ -121,7 +125,7 @@ public class AnomalyResultTransportAction extends HandledTransportAction();
        this.xContentRegistry = xContentRegistry;
-       this.intervalRatioForRequest = AnomalyDetectorSettings.INTERVAL_RATIO_FOR_REQUESTS;
+       this.intervalRatioForRequest = TimeSeriesSettings.INTERVAL_RATIO_FOR_REQUESTS;

-       this.maxEntitiesPerInterval = MAX_ENTITIES_PER_QUERY.get(settings);
-       clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_ENTITIES_PER_QUERY, it -> maxEntitiesPerInterval = it);
+       this.maxEntitiesPerInterval = AD_MAX_ENTITIES_PER_QUERY.get(settings);
+       clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MAX_ENTITIES_PER_QUERY, it -> maxEntitiesPerInterval = it);

-       this.pageSize = PAGE_SIZE.get(settings);
-       clusterService.getClusterSettings().addSettingsUpdateConsumer(PAGE_SIZE, it -> pageSize = it);
+       this.pageSize = AD_PAGE_SIZE.get(settings);
+       clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_PAGE_SIZE,
it -> pageSize = it); this.adTaskManager = adTaskManager; } @@ -253,7 +257,7 @@ protected void doExecute(Task task, ActionRequest actionRequest, ActionListener< }, e -> { // If exception is AnomalyDetectionException and it should not be counted in stats, // we will not count it in failure stats. - if (!(e instanceof AnomalyDetectionException) || ((AnomalyDetectionException) e).isCountedInStats()) { + if (!(e instanceof TimeSeriesException) || ((TimeSeriesException) e).isCountedInStats()) { adStats.getStat(StatNames.AD_EXECUTE_FAIL_COUNT.getName()).increment(); if (hcDetectors.contains(adID)) { adStats.getStat(StatNames.AD_HC_EXECUTE_FAIL_COUNT.getName()).increment(); @@ -263,18 +267,18 @@ protected void doExecute(Task task, ActionRequest actionRequest, ActionListener< original.onFailure(e); }); - if (!EnabledSetting.isADPluginEnabled()) { - throw new EndRunException(adID, CommonErrorMessages.DISABLED_ERR_MSG, true).countedInStats(false); + if (!ADEnabledSetting.isADEnabled()) { + throw new EndRunException(adID, ADCommonMessages.DISABLED_ERR_MSG, true).countedInStats(false); } adStats.getStat(StatNames.AD_EXECUTE_REQUEST_COUNT.getName()).increment(); if (adCircuitBreakerService.isOpen()) { - listener.onFailure(new LimitExceededException(adID, CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false)); + listener.onFailure(new LimitExceededException(adID, CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false)); return; } try { - stateManager.getAnomalyDetector(adID, onGetDetector(listener, adID, request)); + stateManager.getConfig(adID, AnalysisType.AD, onGetDetector(listener, adID, request)); } catch (Exception ex) { handleExecuteException(ex, listener, adID); } @@ -310,7 +314,7 @@ public void onResponse(CompositeRetriever.Page entityFeatures) { } if (entityFeatures != null && false == entityFeatures.isEmpty()) { // wrap expensive operation inside ad threadpool - threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME).execute(() -> { + threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME).execute(() -> { try { Set>> node2Entities = entityFeatures @@ -381,7 +385,7 @@ public void onFailure(Exception e) { private void handleException(Exception e) { Exception convertedException = convertedQueryFailureException(e, detectorId); - if (false == (convertedException instanceof AnomalyDetectionException)) { + if (false == (convertedException instanceof TimeSeriesException)) { Throwable cause = ExceptionsHelper.unwrapCause(convertedException); convertedException = new InternalFailure(detectorId, cause); } @@ -389,7 +393,7 @@ private void handleException(Exception e) { } } - private ActionListener> onGetDetector( + private ActionListener> onGetDetector( ActionListener listener, String adID, AnomalyResultRequest request @@ -400,8 +404,8 @@ private ActionListener> onGetDetector( return; } - AnomalyDetector anomalyDetector = detectorOptional.get(); - if (anomalyDetector.isMultientityDetector()) { + AnomalyDetector anomalyDetector = (AnomalyDetector) detectorOptional.get(); + if (anomalyDetector.isHighCardinality()) { hcDetectors.add(adID); adStats.getStat(StatNames.AD_HC_EXECUTE_REQUEST_COUNT.getName()).increment(); } @@ -444,7 +448,7 @@ private void executeAnomalyDetection( long dataEndTime ) { // HC logic starts here - if (anomalyDetector.isMultientityDetector()) { + if (anomalyDetector.isHighCardinality()) { Optional previousException = stateManager.fetchExceptionAndClear(adID); if (previousException.isPresent()) { Exception exception = previousException.get(); @@ -459,8 +463,7 @@ 
private void executeAnomalyDetection( } // assume requests are in epoch milliseconds - long nextDetectionStartTime = request.getEnd() + (long) (anomalyDetector.getDetectorIntervalInMilliseconds() - * intervalRatioForRequest); + long nextDetectionStartTime = request.getEnd() + (long) (anomalyDetector.getIntervalInMilliseconds() * intervalRatioForRequest); CompositeRetriever compositeRetriever = new CompositeRetriever( dataStartTime, @@ -482,10 +485,7 @@ private void executeAnomalyDetection( try { pageIterator = compositeRetriever.iterator(); } catch (Exception e) { - listener .onFailure( new EndRunException(anomalyDetector.getDetectorId(), CommonErrorMessages.INVALID_SEARCH_QUERY_MSG, e, false) ); + listener.onFailure(new EndRunException(anomalyDetector.getId(), CommonMessages.INVALID_SEARCH_QUERY_MSG, e, false)); return; } @@ -502,13 +502,7 @@ private void executeAnomalyDetection( } else { listener .onResponse( - new AnomalyResultResponse( - new ArrayList<FeatureData>(), - null, - null, - anomalyDetector.getDetectorIntervalInMinutes(), - true - ) + new AnomalyResultResponse(new ArrayList<FeatureData>(), null, null, anomalyDetector.getIntervalInMinutes(), true) ); } return; @@ -526,6 +520,7 @@ private void executeAnomalyDetection( DiscoveryNode rcfNode = asRCFNode.get(); + // shouldStart has already responded through the listener when it returns false if (!shouldStart(listener, adID, anomalyDetector, rcfNode.getId(), rcfModelID)) { return; } @@ -686,7 +681,7 @@ private Exception coldStartIfNoModel(AtomicReference<Exception> failure, Anomaly } // fetch previous cold start exception - String adID = detector.getDetectorId(); + String adID = detector.getId(); final Optional<Exception> previousException = stateManager.fetchExceptionAndClear(adID); if (previousException.isPresent()) { Exception exception = previousException.get(); @@ -695,9 +690,9 @@ private Exception coldStartIfNoModel(AtomicReference<Exception> failure, Anomaly return exception; } } - LOG.info("Trigger cold start for {}", detector.getDetectorId()); + LOG.info("Trigger cold start for {}", detector.getId()); coldStart(detector); - return previousException.orElse(new InternalFailure(adID, CommonErrorMessages.NO_MODEL_ERR_MSG)); + return previousException.orElse(new InternalFailure(adID, ADCommonMessages.NO_MODEL_ERR_MSG)); } private void findException(Throwable cause, String adID, AtomicReference<Exception> failure, String nodeId) { @@ -713,14 +708,14 @@ private void findException(Throwable cause, String adID, AtomicReference<Except - Optional<AnomalyDetectionException> actualException = NotSerializedADExceptionName - .convertWrappedAnomalyDetectionException((NotSerializableExceptionWrapper) causeException, adID); + Optional<TimeSeriesException> actualException = NotSerializedExceptionName + .convertWrappedTimeSeriesException((NotSerializableExceptionWrapper) causeException, adID); if (actualException.isPresent()) { - AnomalyDetectionException adException = actualException.get(); + TimeSeriesException adException = actualException.get(); failure.set(adException); if (adException instanceof ResourceNotFoundException) { // During a rolling upgrade or blue/green deployment, ResourceNotFoundException might be caused by old node using RCF @@ -731,10 +726,10 @@ private void findException(Throwable cause, String adID, AtomicReference<Except void handleExecuteException(Exception ex, ActionListener<AnomalyResultResponse> listener, String adID) { if (ex instanceof ClientException) { listener.onFailure(ex); - } else if (ex instanceof AnomalyDetectionException) { - listener.onFailure(new InternalFailure((AnomalyDetectionException) ex)); + } else if (ex instanceof TimeSeriesException) { + listener.onFailure(new InternalFailure((TimeSeriesException) ex)); } else { Throwable
cause = ExceptionsHelper.unwrapCause(ex); listener.onFailure(new InternalFailure(adID, cause)); @@ -816,7 +811,7 @@ public void onResponse(RCFResultResponse response) { featureInResponse, null, response.getTotalUpdates(), - detector.getDetectorIntervalInMinutes(), + detector.getIntervalInMinutes(), false, response.getRelativeIndex(), response.getAttribution(), @@ -828,7 +823,7 @@ public void onResponse(RCFResultResponse response) { ); } else { LOG.warn(NULL_RESPONSE + " {} for {}", modelID, rcfNodeID); - listener.onFailure(new InternalFailure(adID, CommonErrorMessages.NO_MODEL_ERR_MSG)); + listener.onFailure(new InternalFailure(adID, ADCommonMessages.NO_MODEL_ERR_MSG)); } } catch (Exception ex) { LOG.error(new ParameterizedMessage("Unexpected exception for [{}]", adID), ex); @@ -977,7 +972,7 @@ private boolean shouldStart( } private void coldStart(AnomalyDetector detector) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // If last cold start is not finished, we don't trigger another one if (stateManager.isColdStartRunning(detectorId)) { @@ -992,7 +987,7 @@ private void coldStart(AnomalyDetector detector) { ActionListener trainModelListener = ActionListener .wrap(res -> { LOG.info("Succeeded in training {}", detectorId); }, exception -> { - if (exception instanceof AnomalyDetectionException) { + if (exception instanceof TimeSeriesException) { // e.g., partitioned model exceeds memory limit stateManager.setException(detectorId, exception); } else if (exception instanceof IllegalArgumentException) { @@ -1015,7 +1010,13 @@ private void coldStart(AnomalyDetector detector) { .trainModel( detector, dataPoints, - new ThreadedActionListener<>(LOG, threadPool, AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, trainModelListener, false) + new ThreadedActionListener<>( + LOG, + threadPool, + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, + trainModelListener, + false + ) ); } else { stateManager.setException(detectorId, new EndRunException(detectorId, "Cannot get training data", false)); @@ -1023,7 +1024,7 @@ private void coldStart(AnomalyDetector detector) { }, exception -> { if (exception instanceof OpenSearchTimeoutException) { stateManager.setException(detectorId, new InternalFailure(detectorId, "Time out while getting training data", exception)); - } else if (exception instanceof AnomalyDetectionException) { + } else if (exception instanceof TimeSeriesException) { // e.g., Invalid search query stateManager.setException(detectorId, exception); } else { @@ -1035,7 +1036,7 @@ private void coldStart(AnomalyDetector detector) { .runAfter(listener, coldStartFinishingCallback::close); threadPool - .executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME) + .executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME) .execute( () -> featureManager .getColdStartData( @@ -1043,7 +1044,7 @@ private void coldStart(AnomalyDetector detector) { new ThreadedActionListener<>( LOG, threadPool, - AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, listenerWithReleaseCallback, false ) @@ -1058,7 +1059,7 @@ private void coldStart(AnomalyDetector detector) { * @return previous cold start exception */ private Optional coldStartIfNoCheckPoint(AnomalyDetector detector) { - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); Optional previousException = stateManager.fetchExceptionAndClear(detectorId); @@ -1083,7 +1084,7 @@ private Optional coldStartIfNoCheckPoint(AnomalyDetector detector) { } else { String errorMsg = 
String.format(Locale.ROOT, "Fail to get checkpoint state for %s", detectorId); LOG.error(errorMsg, exception); - stateManager.setException(detectorId, new AnomalyDetectionException(errorMsg, exception)); + stateManager.setException(detectorId, new TimeSeriesException(errorMsg, exception)); } })); diff --git a/src/main/java/org/opensearch/ad/transport/CronTransportAction.java b/src/main/java/org/opensearch/ad/transport/CronTransportAction.java index 43196ac1b..82075d035 100644 --- a/src/main/java/org/opensearch/ad/transport/CronTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/CronTransportAction.java @@ -19,7 +19,6 @@ import org.opensearch.action.FailedNodeException; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.nodes.TransportNodesAction; -import org.opensearch.ad.NodeStateManager; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.feature.FeatureManager; import org.opensearch.ad.ml.EntityColdStarter; @@ -30,6 +29,7 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; import org.opensearch.transport.TransportService; public class CronTransportAction extends TransportNodesAction { diff --git a/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorRequest.java b/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorRequest.java index 9302a0acc..f87b6e0a1 100644 --- a/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorRequest.java +++ b/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorRequest.java @@ -17,7 +17,7 @@ import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; @@ -50,7 +50,7 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(detectorID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } return validationException; } diff --git a/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportAction.java index e3d489e72..221a935bc 100644 --- a/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportAction.java @@ -11,14 +11,13 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_DELETE_DETECTOR; +import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_DELETE_DETECTOR; import static org.opensearch.ad.model.ADTaskType.HISTORICAL_DETECTOR_TASK_TYPES; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES; -import static org.opensearch.ad.util.ParseUtils.getUserContext; -import static 
org.opensearch.ad.util.ParseUtils.resolveUserAndExecute; -import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.util.ParseUtils.getUserContext; +import static org.opensearch.timeseries.util.ParseUtils.resolveUserAndExecute; +import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener; import java.io.IOException; @@ -33,13 +32,10 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.task.ADTaskManager; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.inject.Inject; @@ -52,6 +48,10 @@ import org.opensearch.core.xcontent.XContentParser; import org.opensearch.index.IndexNotFoundException; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.RestHandlerUtils; import org.opensearch.transport.TransportService; public class DeleteAnomalyDetectorTransportAction extends HandledTransportAction { @@ -80,8 +80,8 @@ public DeleteAnomalyDetectorTransportAction( this.clusterService = clusterService; this.xContentRegistry = xContentRegistry; this.adTaskManager = adTaskManager; - filterByEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); + filterByEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); } @Override @@ -120,7 +120,8 @@ protected void doExecute(Task task, DeleteAnomalyDetectorRequest request, Action }, listener), client, clusterService, - xContentRegistry + xContentRegistry, + AnomalyDetector.class ); } catch (Exception e) { LOG.error(e); @@ -130,7 +131,7 @@ protected void doExecute(Task task, DeleteAnomalyDetectorRequest request, Action private void deleteAnomalyDetectorJobDoc(String detectorId, ActionListener listener) { LOG.info("Delete anomaly detector job {}", detectorId); - DeleteRequest deleteRequest = new DeleteRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, detectorId) + DeleteRequest deleteRequest = new DeleteRequest(CommonName.JOB_INDEX, detectorId) .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); client.delete(deleteRequest, ActionListener.wrap(response -> { if (response.getResult() == DocWriteResponse.Result.DELETED || response.getResult() == DocWriteResponse.Result.NOT_FOUND) { @@ -153,7 +154,7 @@ private void deleteAnomalyDetectorJobDoc(String detectorId, ActionListener listener) { LOG.info("Delete detector info {}", detectorId); - DeleteRequest deleteRequest = new 
DeleteRequest(CommonName.DETECTION_STATE_INDEX, detectorId); + DeleteRequest deleteRequest = new DeleteRequest(ADCommonName.DETECTION_STATE_INDEX, detectorId); client.delete(deleteRequest, ActionListener.wrap(response -> { // whether deleted state doc or not, continue as state doc may not exist deleteAnomalyDetectorDoc(detectorId, listener); @@ -169,7 +170,7 @@ private void deleteDetectorStateDoc(String detectorId, ActionListener listener) { LOG.info("Delete anomaly detector {}", detectorId); - DeleteRequest deleteRequest = new DeleteRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX, detectorId) + DeleteRequest deleteRequest = new DeleteRequest(CommonName.CONFIG_INDEX, detectorId) .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); client.delete(deleteRequest, new ActionListener() { @Override @@ -184,9 +185,9 @@ public void onFailure(Exception e) { }); } - private void getDetectorJob(String detectorId, ActionListener listener, AnomalyDetectorFunction function) { - if (clusterService.state().metadata().indices().containsKey(ANOMALY_DETECTOR_JOB_INDEX)) { - GetRequest request = new GetRequest(ANOMALY_DETECTOR_JOB_INDEX).id(detectorId); + private void getDetectorJob(String detectorId, ActionListener listener, ExecutorFunction function) { + if (clusterService.state().metadata().indices().containsKey(CommonName.JOB_INDEX)) { + GetRequest request = new GetRequest(CommonName.JOB_INDEX).id(detectorId); client.get(request, ActionListener.wrap(response -> onGetAdJobResponseForWrite(response, listener, function), exception -> { LOG.error("Fail to get anomaly detector job: " + detectorId, exception); listener.onFailure(exception); @@ -196,7 +197,7 @@ private void getDetectorJob(String detectorId, ActionListener li } } - private void onGetAdJobResponseForWrite(GetResponse response, ActionListener listener, AnomalyDetectorFunction function) + private void onGetAdJobResponseForWrite(GetResponse response, ActionListener listener, ExecutorFunction function) throws IOException { if (response.isExists()) { String adJobId = response.getId(); @@ -207,7 +208,7 @@ private void onGetAdJobResponseForWrite(GetResponse response, ActionListener filterEnabled = it); + filterEnabled = AD_FILTER_BY_BACKEND_ROLES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterEnabled = it); } @Override diff --git a/src/main/java/org/opensearch/ad/transport/DeleteModelRequest.java b/src/main/java/org/opensearch/ad/transport/DeleteModelRequest.java index d769de0d0..9ec58acda 100644 --- a/src/main/java/org/opensearch/ad/transport/DeleteModelRequest.java +++ b/src/main/java/org/opensearch/ad/transport/DeleteModelRequest.java @@ -17,8 +17,8 @@ import org.opensearch.action.ActionRequestValidationException; import org.opensearch.action.support.nodes.BaseNodesRequest; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.StreamInput; @@ -61,7 +61,7 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(adID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = 
addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } return validationException; } @@ -69,7 +69,7 @@ public ActionRequestValidationException validate() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, adID); + builder.field(ADCommonName.ID_JSON_KEY, adID); builder.endObject(); return builder; } diff --git a/src/main/java/org/opensearch/ad/transport/DeleteModelTransportAction.java b/src/main/java/org/opensearch/ad/transport/DeleteModelTransportAction.java index b8220601c..10aa64725 100644 --- a/src/main/java/org/opensearch/ad/transport/DeleteModelTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/DeleteModelTransportAction.java @@ -19,7 +19,6 @@ import org.opensearch.action.FailedNodeException; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.nodes.TransportNodesAction; -import org.opensearch.ad.NodeStateManager; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.feature.FeatureManager; import org.opensearch.ad.ml.EntityColdStarter; @@ -30,6 +29,7 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; import org.opensearch.transport.TransportService; public class DeleteModelTransportAction extends diff --git a/src/main/java/org/opensearch/ad/transport/EntityProfileRequest.java b/src/main/java/org/opensearch/ad/transport/EntityProfileRequest.java index ec0f9543c..2aba165a7 100644 --- a/src/main/java/org/opensearch/ad/transport/EntityProfileRequest.java +++ b/src/main/java/org/opensearch/ad/transport/EntityProfileRequest.java @@ -19,16 +19,16 @@ import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.Entity; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.EntityProfileName; -import org.opensearch.ad.util.Bwc; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.model.Entity; public class EntityProfileRequest extends ActionRequest implements ToXContentObject { public static final String ENTITY = "entity"; @@ -41,17 +41,8 @@ public class EntityProfileRequest extends ActionRequest implements ToXContentObj public EntityProfileRequest(StreamInput in) throws IOException { super(in); adID = in.readString(); - if (Bwc.supportMultiCategoryFields(in.getVersion())) { - entityValue = new Entity(in); - } else { - // entity profile involving an old node won't work. Read - // EntityProfileTransportAction.doExecute for details. Read - // a string to not cause EOF exception. - // Cannot assign null to entityValue as old node has no logic to - // deal with a null entity. 
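Note on the branch being deleted here: OpenSearch stream serialization must stay symmetric, so every byte writeTo emits must be consumed by the receiving StreamInput constructor or the peer fails with an EOF error; that is why the old code still wrote a placeholder string to pre-1.1 nodes rather than skipping the field. With the version gate gone, the request reduces to a plain Writeable round trip. A minimal sketch of the symmetric pair, reusing the calls visible in this hunk (the read-side loop body is assumed, since the diff elides it):

    // Read side: consume fields in exactly the order writeTo produced them.
    public EntityProfileRequest(StreamInput in) throws IOException {
        super(in);
        adID = in.readString();
        entityValue = new Entity(in);                 // always the object form now
        int size = in.readVInt();                     // length prefix for the enum set
        profilesToCollect = new HashSet<EntityProfileName>();
        for (int i = 0; i < size; i++) {
            profilesToCollect.add(in.readEnum(EntityProfileName.class));
        }
    }

    // Write side: same fields, same order, same count.
    @Override
    public void writeTo(StreamOutput out) throws IOException {
        super.writeTo(out);
        out.writeString(adID);
        entityValue.writeTo(out);
        out.writeVInt(profilesToCollect.size());
        for (EntityProfileName profile : profilesToCollect) {
            out.writeEnum(profile);
        }
    }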
- String oldFormatEntityString = in.readString(); - entityValue = Entity.createSingleAttributeEntity(CommonName.EMPTY_FIELD, oldFormatEntityString); - } + entityValue = new Entity(in); + int size = in.readVInt(); profilesToCollect = new HashSet(); if (size != 0) { @@ -84,14 +75,8 @@ public Set getProfilesToCollect() { public void writeTo(StreamOutput out) throws IOException { super.writeTo(out); out.writeString(adID); - if (Bwc.supportMultiCategoryFields(out.getVersion())) { - entityValue.writeTo(out); - } else { - // entity profile involving an old node won't work. Read - // EntityProfileTransportAction.doExecute for details. Write - // a string to not cause EOF exception. - out.writeString(entityValue.toString()); - } + entityValue.writeTo(out); + out.writeVInt(profilesToCollect.size()); for (EntityProfileName profile : profilesToCollect) { out.writeEnum(profile); @@ -102,13 +87,13 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(adID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } if (entityValue == null) { validationException = addValidationError("Entity value is missing", validationException); } if (profilesToCollect == null || profilesToCollect.isEmpty()) { - validationException = addValidationError(CommonErrorMessages.EMPTY_PROFILES_COLLECT, validationException); + validationException = addValidationError(CommonMessages.EMPTY_PROFILES_COLLECT, validationException); } return validationException; } @@ -116,7 +101,7 @@ public ActionRequestValidationException validate() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, adID); + builder.field(ADCommonName.ID_JSON_KEY, adID); builder.field(ENTITY, entityValue); builder.field(PROFILES, profilesToCollect); builder.endObject(); diff --git a/src/main/java/org/opensearch/ad/transport/EntityProfileResponse.java b/src/main/java/org/opensearch/ad/transport/EntityProfileResponse.java index db7642842..1b8b51da2 100644 --- a/src/main/java/org/opensearch/ad/transport/EntityProfileResponse.java +++ b/src/main/java/org/opensearch/ad/transport/EntityProfileResponse.java @@ -17,10 +17,8 @@ import org.apache.commons.lang.builder.EqualsBuilder; import org.apache.commons.lang.builder.HashCodeBuilder; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.ModelProfile; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ModelProfileOnNode; -import org.opensearch.ad.util.Bwc; import org.opensearch.core.action.ActionResponse; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; @@ -82,14 +80,7 @@ public EntityProfileResponse(StreamInput in) throws IOException { lastActiveMs = in.readLong(); totalUpdates = in.readLong(); if (in.readBoolean()) { - if (Bwc.supportMultiCategoryFields(in.getVersion())) { - modelProfile = new ModelProfileOnNode(in); - } else { - // we don't have model information from old node - ModelProfile profile = new ModelProfile(in); - modelProfile = new ModelProfileOnNode("", profile); - } - + modelProfile = new ModelProfileOnNode(in); } else { 
modelProfile = null; } @@ -118,12 +109,7 @@ public void writeTo(StreamOutput out) throws IOException { out.writeLong(totalUpdates); if (modelProfile != null) { out.writeBoolean(true); - if (Bwc.supportMultiCategoryFields(out.getVersion())) { - modelProfile.writeTo(out); - } else { - ModelProfile oldFormatModelProfile = modelProfile.getModelProfile(); - oldFormatModelProfile.writeTo(out); - } + modelProfile.writeTo(out); } else { out.writeBoolean(false); } @@ -142,7 +128,7 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws builder.field(TOTAL_UPDATES, totalUpdates); } if (modelProfile != null) { - builder.field(CommonName.MODEL, modelProfile); + builder.field(ADCommonName.MODEL, modelProfile); } builder.endObject(); return builder; @@ -154,7 +140,7 @@ public String toString() { builder.append(ACTIVE, isActive); builder.append(LAST_ACTIVE_TS, lastActiveMs); builder.append(TOTAL_UPDATES, totalUpdates); - builder.append(CommonName.MODEL, modelProfile); + builder.append(ADCommonName.MODEL, modelProfile); return builder.toString(); } diff --git a/src/main/java/org/opensearch/ad/transport/EntityProfileTransportAction.java b/src/main/java/org/opensearch/ad/transport/EntityProfileTransportAction.java index 567f2dbd3..fedfb2aa7 100644 --- a/src/main/java/org/opensearch/ad/transport/EntityProfileTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/EntityProfileTransportAction.java @@ -23,8 +23,6 @@ import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.EntityProfileName; import org.opensearch.ad.model.ModelProfile; import org.opensearch.ad.model.ModelProfileOnNode; @@ -37,6 +35,8 @@ import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.tasks.Task; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.model.Entity; import org.opensearch.transport.TransportException; import org.opensearch.transport.TransportRequestOptions; import org.opensearch.transport.TransportResponseHandler; @@ -73,7 +73,7 @@ public EntityProfileTransportAction( this.option = TransportRequestOptions .builder() .withType(TransportRequestOptions.Type.REG) - .withTimeout(AnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings)) + .withTimeout(AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(settings)) .build(); this.clusterService = clusterService; this.cacheProvider = cacheProvider; @@ -86,14 +86,14 @@ protected void doExecute(Task task, EntityProfileRequest request, ActionListener Entity entityValue = request.getEntityValue(); Optional modelIdOptional = entityValue.getModelId(adID); if (false == modelIdOptional.isPresent()) { - listener.onFailure(new AnomalyDetectionException(adID, NO_MODEL_ID_FOUND_MSG)); + listener.onFailure(new TimeSeriesException(adID, NO_MODEL_ID_FOUND_MSG)); return; } // we use entity's toString (e.g., app_0) to find its node // This should be consistent with how we land a model node in AnomalyResultTransportAction Optional node = hashRing.getOwningNodeWithSameLocalAdVersionForRealtimeAD(entityValue.toString()); if (false == node.isPresent()) { - listener.onFailure(new AnomalyDetectionException(adID, NO_NODE_FOUND_MSG)); + listener.onFailure(new TimeSeriesException(adID, NO_NODE_FOUND_MSG)); return; } String nodeId = 
node.get().getId(); @@ -157,7 +157,7 @@ public String executor() { ); } catch (Exception e) { LOG.error(String.format(Locale.ROOT, "Fail to get entity profile for detector {}, entity {}", adID, entityValue), e); - listener.onFailure(new AnomalyDetectionException(adID, FAIL_TO_GET_ENTITY_PROFILE_MSG, e)); + listener.onFailure(new TimeSeriesException(adID, FAIL_TO_GET_ENTITY_PROFILE_MSG, e)); } } else { @@ -174,7 +174,7 @@ public String executor() { adID, entityValue ); - listener.onFailure(new AnomalyDetectionException(adID, FAIL_TO_GET_ENTITY_PROFILE_MSG)); + listener.onFailure(new TimeSeriesException(adID, FAIL_TO_GET_ENTITY_PROFILE_MSG)); } } } diff --git a/src/main/java/org/opensearch/ad/transport/EntityResultRequest.java b/src/main/java/org/opensearch/ad/transport/EntityResultRequest.java index 236d55fc3..91041f447 100644 --- a/src/main/java/org/opensearch/ad/transport/EntityResultRequest.java +++ b/src/main/java/org/opensearch/ad/transport/EntityResultRequest.java @@ -14,7 +14,6 @@ import static org.opensearch.action.ValidateActions.addValidationError; import java.io.IOException; -import java.util.HashMap; import java.util.Locale; import java.util.Map; @@ -22,15 +21,16 @@ import org.apache.logging.log4j.Logger; import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.util.Bwc; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; public class EntityResultRequest extends ActionRequest implements ToXContentObject { private static final Logger LOG = LogManager.getLogger(EntityResultRequest.class); @@ -46,21 +46,7 @@ public EntityResultRequest(StreamInput in) throws IOException { // guarded with version check. Just in case we receive requests from older node where we use String // to represent an entity - if (Bwc.supportMultiCategoryFields(in.getVersion())) { - this.entities = in.readMap(Entity::new, StreamInput::readDoubleArray); - } else { - // receive a request from a version before OpenSearch 1.1 - // the old request uses Map instead of Map to represent entities - // since it only supports one categorical field. - Map oldFormatEntities = in.readMap(StreamInput::readString, StreamInput::readDoubleArray); - entities = new HashMap<>(); - for (Map.Entry entry : oldFormatEntities.entrySet()) { - // we don't know the category field name as we don't have access to detector config object - // so we put empty string as the category field name for now. Will handle the case - // in EntityResultTransportAciton. 
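Once every node speaks the multi-category Entity wire format, the entities map in this request round-trips through OpenSearch's generic map helpers with no version gate: a Writeable reader on the key side and a primitive-array reader on the value side. The pair, extracted from this hunk with the lambda spelled out for clarity:

    // Deserialize: each key via the Entity(StreamInput) constructor, each value as a double[].
    this.entities = in.readMap(Entity::new, StreamInput::readDoubleArray);

    // Serialize: each key via Entity.writeTo, each value as a double[]; shapes must mirror the read.
    out.writeMap(entities, (stream, entity) -> entity.writeTo(stream), StreamOutput::writeDoubleArray);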
- entities.put(Entity.createSingleAttributeEntity(CommonName.EMPTY_FIELD, entry.getKey()), entry.getValue()); - } - } + this.entities = in.readMap(Entity::new, StreamInput::readDoubleArray); this.start = in.readLong(); this.end = in.readLong(); @@ -74,7 +60,7 @@ public EntityResultRequest(String detectorId, Map entities, lo this.end = end; } - public String getDetectorId() { + public String getId() { return this.detectorId; } @@ -96,25 +82,7 @@ public void writeTo(StreamOutput out) throws IOException { out.writeString(this.detectorId); // guarded with version check. Just in case we send requests to older node where we use String // to represent an entity - if (Bwc.supportMultiCategoryFields(out.getVersion())) { - out.writeMap(entities, (s, e) -> e.writeTo(s), StreamOutput::writeDoubleArray); - } else { - Map oldFormatEntities = new HashMap<>(); - for (Map.Entry entry : entities.entrySet()) { - Map attributes = entry.getKey().getAttributes(); - if (attributes.size() != 1) { - // cannot send a multi-category field entity to old node since it will - // cause EOF exception and stop the detector. The issue - // is temporary and will be gone after upgrade completes. - // Since one EntityResultRequest is sent to one node, we can safely - // ignore the rest of the requests. - LOG.info("Skip sending multi-category entities to an incompatible node. Attributes: ", attributes); - break; - } - oldFormatEntities.put(entry.getKey().getAttributes().entrySet().iterator().next().getValue(), entry.getValue()); - } - out.writeMap(oldFormatEntities, StreamOutput::writeString, StreamOutput::writeDoubleArray); - } + out.writeMap(entities, (s, e) -> e.writeTo(s), StreamOutput::writeDoubleArray); out.writeLong(this.start); out.writeLong(this.end); @@ -124,11 +92,11 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(detectorId)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } if (start <= 0 || end <= 0 || start > end) { validationException = addValidationError( - String.format(Locale.ROOT, "%s: start %d, end %d", CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG, start, end), + String.format(Locale.ROOT, "%s: start %d, end %d", CommonMessages.INVALID_TIMESTAMP_ERR_MSG, start, end), validationException ); } @@ -138,7 +106,7 @@ public ActionRequestValidationException validate() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, detectorId); + builder.field(ADCommonName.ID_JSON_KEY, detectorId); builder.field(CommonName.START_JSON_KEY, start); builder.field(CommonName.END_JSON_KEY, end); builder.startArray(CommonName.ENTITIES_JSON_KEY); diff --git a/src/main/java/org/opensearch/ad/transport/EntityResultTransportAction.java b/src/main/java/org/opensearch/ad/transport/EntityResultTransportAction.java index 0a40efaeb..d17ce7137 100644 --- a/src/main/java/org/opensearch/ad/transport/EntityResultTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/EntityResultTransportAction.java @@ -26,23 +26,16 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; import org.opensearch.action.support.master.AcknowledgedResponse; -import 
org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.indices.ADIndex; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.ratelimit.CheckpointReadWorker; import org.opensearch.ad.ratelimit.ColdEntityWorker; import org.opensearch.ad.ratelimit.EntityColdStartWorker; @@ -51,13 +44,22 @@ import org.opensearch.ad.ratelimit.ResultWriteRequest; import org.opensearch.ad.ratelimit.ResultWriteWorker; import org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; -import org.opensearch.ad.util.ExceptionUtil; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.common.inject.Inject; import org.opensearch.core.action.ActionListener; import org.opensearch.tasks.Task; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.ParseUtils; import org.opensearch.transport.TransportService; /** @@ -83,10 +85,10 @@ public class EntityResultTransportAction extends HandledTransportAction listener) { if (adCircuitBreakerService.isOpen()) { - threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME).execute(() -> cache.get().releaseMemoryForOpenCircuitBreaker()); - listener - .onFailure(new LimitExceededException(request.getDetectorId(), CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false)); + threadPool + .executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME) + .execute(() -> cache.get().releaseMemoryForOpenCircuitBreaker()); + listener.onFailure(new LimitExceededException(request.getId(), CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false)); return; } try { - String detectorId = request.getDetectorId(); + String detectorId = request.getId(); Optional previousException = stateManager.fetchExceptionAndClear(detectorId); @@ -152,14 +155,14 @@ protected void doExecute(Task task, EntityResultRequest request, ActionListener< listener = ExceptionUtil.wrapListener(listener, exception, detectorId); } - stateManager.getAnomalyDetector(detectorId, onGetDetector(listener, detectorId, request, previousException)); + stateManager.getConfig(detectorId, AnalysisType.AD, onGetDetector(listener, detectorId, request, 
previousException)); } catch (Exception exception) { LOG.error("fail to get entity's anomaly grade", exception); listener.onFailure(exception); } } - private ActionListener> onGetDetector( + private ActionListener> onGetDetector( ActionListener listener, String detectorId, EntityResultRequest request, @@ -171,7 +174,7 @@ private ActionListener> onGetDetector( return; } - AnomalyDetector detector = detectorOptional.get(); + AnomalyDetector detector = (AnomalyDetector) detectorOptional.get(); if (request.getEntities() == null) { listener.onFailure(new EndRunException(detectorId, "Fail to get any entities from request.", false)); @@ -184,12 +187,12 @@ private ActionListener> onGetDetector( Entity categoricalValues = entityEntry.getKey(); if (isEntityFromOldNodeMsg(categoricalValues) - && detector.getCategoryField() != null - && detector.getCategoryField().size() == 1) { + && detector.getCategoryFields() != null + && detector.getCategoryFields().size() == 1) { Map attrValues = categoricalValues.getAttributes(); // handle a request from a version before OpenSearch 1.1. categoricalValues = Entity - .createSingleAttributeEntity(detector.getCategoryField().get(0), attrValues.get(CommonName.EMPTY_FIELD)); + .createSingleAttributeEntity(detector.getCategoryFields().get(0), attrValues.get(ADCommonName.EMPTY_FIELD)); } Optional modelIdOptional = categoricalValues.getModelId(detectorId); @@ -212,31 +215,32 @@ private ActionListener> onGetDetector( // result.getGrade() = 0 means it is not an anomaly // So many OpenSearchRejectedExecutionException if we write no matter what if (result.getRcfScore() > 0) { - AnomalyResult resultToSave = result - .toAnomalyResult( + List resultsToSave = result + .toIndexableResults( detector, Instant.ofEpochMilli(request.getStart()), Instant.ofEpochMilli(request.getEnd()), executionStartTime, Instant.now(), ParseUtils.getFeatureData(datapoint, detector), - categoricalValues, + Optional.ofNullable(categoricalValues), indexUtil.getSchemaVersion(ADIndex.RESULT), modelId, null, null ); - - resultWriteQueue - .put( - new ResultWriteRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), - detectorId, - result.getGrade() > 0 ? RequestPriority.HIGH : RequestPriority.MEDIUM, - resultToSave, - detector.getResultIndex() - ) - ); + for (AnomalyResult r : resultsToSave) { + resultWriteQueue + .put( + new ResultWriteRequest( + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), + detectorId, + result.getGrade() > 0 ? RequestPriority.HIGH : RequestPriority.MEDIUM, + r, + detector.getCustomResultIndex() + ) + ); + } } } catch (IllegalArgumentException e) { // fail to score likely due to model corruption. Re-cold start to recover. 
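Two details of the write path above are easy to miss: a result is persisted only when its RCF score is positive (unconditionally writing zero-score results previously flooded the write path with OpenSearchRejectedExecutionException), and the queue priority is derived from the anomaly grade so confirmed anomalies survive load shedding. Condensed from the hunk, with the same names:

    for (AnomalyResult r : resultsToSave) {
        resultWriteQueue.put(new ResultWriteRequest(
            System.currentTimeMillis() + detector.getIntervalInMilliseconds(),     // expire after one interval
            detectorId,
            result.getGrade() > 0 ? RequestPriority.HIGH : RequestPriority.MEDIUM, // anomalies jump the queue
            r,
            detector.getCustomResultIndex()                                        // custom result index, if any
        ));
    }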
@@ -246,7 +250,7 @@ private ActionListener> onGetDetector( entityColdStartWorker .put( new EntityFeatureRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), detectorId, RequestPriority.MEDIUM, categoricalValues, @@ -274,7 +278,7 @@ private ActionListener> onGetDetector( hotEntityRequests .add( new EntityFeatureRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), detectorId, // hot entities has MEDIUM priority RequestPriority.MEDIUM, @@ -294,7 +298,7 @@ private ActionListener> onGetDetector( coldEntityRequests .add( new EntityFeatureRequest( - System.currentTimeMillis() + detector.getDetectorIntervalInMilliseconds(), + System.currentTimeMillis() + detector.getIntervalInMilliseconds(), detectorId, // cold entities has LOW priority RequestPriority.LOW, @@ -347,6 +351,6 @@ private ActionListener> onGetDetector( */ private boolean isEntityFromOldNodeMsg(Entity categoricalValues) { Map attrValues = categoricalValues.getAttributes(); - return (attrValues != null && attrValues.containsKey(CommonName.EMPTY_FIELD)); + return (attrValues != null && attrValues.containsKey(ADCommonName.EMPTY_FIELD)); } } diff --git a/src/main/java/org/opensearch/ad/transport/ForwardADTaskAction.java b/src/main/java/org/opensearch/ad/transport/ForwardADTaskAction.java index 14a4dbeb5..f63a188cd 100644 --- a/src/main/java/org/opensearch/ad/transport/ForwardADTaskAction.java +++ b/src/main/java/org/opensearch/ad/transport/ForwardADTaskAction.java @@ -11,17 +11,18 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.constant.CommonName.AD_TASK; +import static org.opensearch.ad.constant.ADCommonName.AD_TASK; import org.opensearch.action.ActionType; import org.opensearch.ad.constant.CommonValue; +import org.opensearch.timeseries.transport.JobResponse; -public class ForwardADTaskAction extends ActionType { +public class ForwardADTaskAction extends ActionType { // Internal Action which is not used for public facing RestAPIs. 
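ForwardADTaskAction, just below, follows the standard OpenSearch pattern for declaring a transport action: a singleton ActionType parameterized by the response class, a namespaced NAME, and a response reader handed to the super constructor. A generic sketch of that pattern with hypothetical names (ExampleAction, ExampleResponse, and the NAME string are stand-ins, not plugin classes):

    import org.opensearch.action.ActionType;

    public class ExampleAction extends ActionType<ExampleResponse> {
        // Built from an internal prefix (CommonValue.INTERNAL_ACTION_PREFIX in the plugin)
        // so the action is never exposed through public-facing REST APIs.
        public static final String NAME = "cluster:internal/example/forward";  // hypothetical name
        public static final ExampleAction INSTANCE = new ExampleAction();

        private ExampleAction() {
            super(NAME, ExampleResponse::new);  // reader reconstructs the response from the wire
        }
    }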
public static final String NAME = CommonValue.INTERNAL_ACTION_PREFIX + "detector/" + AD_TASK + "/forward"; public static final ForwardADTaskAction INSTANCE = new ForwardADTaskAction(); private ForwardADTaskAction() { - super(NAME, AnomalyDetectorJobResponse::new); + super(NAME, JobResponse::new); } } diff --git a/src/main/java/org/opensearch/ad/transport/ForwardADTaskRequest.java b/src/main/java/org/opensearch/ad/transport/ForwardADTaskRequest.java index 98e7d018d..417696609 100644 --- a/src/main/java/org/opensearch/ad/transport/ForwardADTaskRequest.java +++ b/src/main/java/org/opensearch/ad/transport/ForwardADTaskRequest.java @@ -20,23 +20,22 @@ import org.opensearch.Version; import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.cluster.ADVersionUtil; -import org.opensearch.ad.common.exception.ADVersionException; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskAction; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; import org.opensearch.commons.authuser.User; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.timeseries.common.exception.VersionException; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.DateRange; import org.opensearch.transport.TransportService; public class ForwardADTaskRequest extends ActionRequest { private AnomalyDetector detector; private ADTask adTask; - private DetectionDateRange detectionDateRange; + private DateRange detectionDateRange; private List staleRunningEntities; private User user; private Integer availableTaskSlots; @@ -47,7 +46,7 @@ public class ForwardADTaskRequest extends ActionRequest { * For most task actions, we only send ForwardADTaskRequest to node with same local AD version. * But it's possible that we need to clean up detector cache by sending FINISHED task action to * an old coordinating node when no task running for the detector. - * Check {@link org.opensearch.ad.task.ADTaskManager#cleanDetectorCache(ADTask, TransportService, AnomalyDetectorFunction)}. + * Check {@link org.opensearch.ad.task.ADTaskManager#cleanDetectorCache(ADTask, TransportService, ExecutorFunction)}. 
* * @param detector detector * @param detectionDateRange detection date range @@ -58,17 +57,14 @@ public class ForwardADTaskRequest extends ActionRequest { */ public ForwardADTaskRequest( AnomalyDetector detector, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, User user, ADTaskAction adTaskAction, Integer availableTaskSlots, Version remoteAdVersion ) { - if (!ADVersionUtil.compatibleWithVersionOnOrAfter1_1(remoteAdVersion)) { - throw new ADVersionException( - detector.getDetectorId(), - "Can't forward AD task request to node running AD version " + remoteAdVersion - ); + if (remoteAdVersion == null) { + throw new VersionException(detector.getId(), "Can't forward AD task request to node running null AD version "); } this.detector = detector; this.detectionDateRange = detectionDateRange; @@ -77,7 +73,7 @@ public ForwardADTaskRequest( this.adTaskAction = adTaskAction; } - public ForwardADTaskRequest(AnomalyDetector detector, DetectionDateRange detectionDateRange, User user, ADTaskAction adTaskAction) { + public ForwardADTaskRequest(AnomalyDetector detector, DateRange detectionDateRange, User user, ADTaskAction adTaskAction) { this.detector = detector; this.detectionDateRange = detectionDateRange; this.user = user; @@ -114,13 +110,13 @@ public ForwardADTaskRequest(StreamInput in) throws IOException { // This will reject request from old node running AD version on or before 1.0. // So if coordinating node is old node, it can't use new node as worker node // to run task. - throw new ADVersionException("Can't process ForwardADTaskRequest of old version"); + throw new VersionException("Can't process ForwardADTaskRequest of old version"); } if (in.readBoolean()) { this.adTask = new ADTask(in); } if (in.readBoolean()) { - this.detectionDateRange = new DetectionDateRange(in); + this.detectionDateRange = new DateRange(in); } this.staleRunningEntities = in.readOptionalStringList(); availableTaskSlots = in.readOptionalInt(); @@ -158,15 +154,15 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (detector == null) { - validationException = addValidationError(CommonErrorMessages.DETECTOR_MISSING, validationException); - } else if (detector.getDetectorId() == null) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.DETECTOR_MISSING, validationException); + } else if (detector.getId() == null) { + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } if (adTaskAction == null) { - validationException = addValidationError(CommonErrorMessages.AD_TASK_ACTION_MISSING, validationException); + validationException = addValidationError(ADCommonMessages.AD_TASK_ACTION_MISSING, validationException); } if (adTaskAction == ADTaskAction.CLEAN_STALE_RUNNING_ENTITIES && (staleRunningEntities == null || staleRunningEntities.isEmpty())) { - validationException = addValidationError(CommonErrorMessages.EMPTY_STALE_RUNNING_ENTITIES, validationException); + validationException = addValidationError(ADCommonMessages.EMPTY_STALE_RUNNING_ENTITIES, validationException); } return validationException; } @@ -179,7 +175,7 @@ public ADTask getAdTask() { return adTask; } - public DetectionDateRange getDetectionDateRange() { + public DateRange getDetectionDateRange() { return detectionDateRange; } diff --git 
a/src/main/java/org/opensearch/ad/transport/ForwardADTaskTransportAction.java b/src/main/java/org/opensearch/ad/transport/ForwardADTaskTransportAction.java index f22420b63..d2c571fa8 100644 --- a/src/main/java/org/opensearch/ad/transport/ForwardADTaskTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/ForwardADTaskTransportAction.java @@ -23,13 +23,10 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; -import org.opensearch.ad.NodeStateManager; import org.opensearch.ad.feature.FeatureManager; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskAction; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectionDateRange; import org.opensearch.ad.task.ADTaskCacheManager; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.common.inject.Inject; @@ -37,11 +34,15 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.rest.RestStatus; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.transport.JobResponse; import org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableMap; -public class ForwardADTaskTransportAction extends HandledTransportAction { +public class ForwardADTaskTransportAction extends HandledTransportAction { private final Logger logger = LogManager.getLogger(ForwardADTaskTransportAction.class); private final TransportService transportService; private final ADTaskManager adTaskManager; @@ -75,11 +76,11 @@ public ForwardADTaskTransportAction( } @Override - protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener listener) { + protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener listener) { ADTaskAction adTaskAction = request.getAdTaskAction(); AnomalyDetector detector = request.getDetector(); - DetectionDateRange detectionDateRange = request.getDetectionDateRange(); - String detectorId = detector.getDetectorId(); + DateRange detectionDateRange = request.getDetectionDateRange(); + String detectorId = detector.getId(); ADTask adTask = request.getAdTask(); User user = request.getUser(); Integer availableTaskSlots = request.getAvailableTaskSLots(); @@ -108,20 +109,20 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener // Start historical analysis for detector logger.debug("Received START action for detector {}", detectorId); adTaskManager.startDetector(detector, detectionDateRange, user, transportService, ActionListener.wrap(r -> { - adTaskCacheManager.setDetectorTaskSlots(detector.getDetectorId(), availableTaskSlots); + adTaskCacheManager.setDetectorTaskSlots(detector.getId(), availableTaskSlots); listener.onResponse(r); }, e -> listener.onFailure(e))); break; case NEXT_ENTITY: logger.debug("Received NEXT_ENTITY action for detector {}, task {}", detectorId, adTask.getTaskId()); // Run next entity for HC detector historical analysis. 
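The START branch in doExecute shows the listener-composition idiom used throughout this action: wrap the downstream listener so the success path can update local state (here, the detector's task slots) before forwarding the response, while failures propagate unchanged. Reduced to its core, with the same names as the hunk:

    ActionListener<JobResponse> wrapped = ActionListener.wrap(r -> {
        adTaskCacheManager.setDetectorTaskSlots(detector.getId(), availableTaskSlots);
        listener.onResponse(r);        // forward only after the cache is updated
    }, listener::onFailure);           // errors pass straight through
    adTaskManager.startDetector(detector, detectionDateRange, user, transportService, wrapped);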
-                if (detector.isMultientityDetector()) { // AD task could be HC detector level task or entity task
+                if (detector.isHighCardinality()) { // AD task could be HC detector level task or entity task
                     adTaskCacheManager.removeRunningEntity(detectorId, entityValue);
                     if (!adTaskCacheManager.hasEntity(detectorId)) {
                         adTaskCacheManager.setDetectorTaskSlots(detectorId, 0);
                         logger.info("Historical HC detector done, will remove from cache, detector id:{}", detectorId);
-                        listener.onResponse(new AnomalyDetectorJobResponse(detectorId, 0, 0, 0, RestStatus.OK));
-                        ADTaskState state = !adTask.isEntityTask() && adTask.getError() != null ? ADTaskState.FAILED : ADTaskState.FINISHED;
+                        listener.onResponse(new JobResponse(detectorId));
+                        TaskState state = !adTask.isEntityTask() && adTask.getError() != null ? TaskState.FAILED : TaskState.FINISHED;
                         adTaskManager.setHCDetectorTaskDone(adTask, state, listener);
                     } else {
                         logger.debug("Run next entity for detector " + detectorId);
@@ -133,7 +134,7 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                                     ImmutableMap
                                         .of(
                                             STATE_FIELD,
-                                            ADTaskState.RUNNING.name(),
+                                            TaskState.RUNNING.name(),
                                             TASK_PROGRESS_FIELD,
                                             adTaskManager.hcDetectorProgress(detectorId),
                                             ERROR_FIELD,
@@ -157,18 +158,18 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                 if (adTask.isEntityTask()) { // AD task must be entity level task.
                     adTaskCacheManager.removeRunningEntity(detectorId, entityValue);
                     if (adTaskManager.isRetryableError(adTask.getError())
-                        && !adTaskCacheManager.exceedRetryLimit(adTask.getDetectorId(), adTask.getTaskId())) {
+                        && !adTaskCacheManager.exceedRetryLimit(adTask.getConfigId(), adTask.getTaskId())) {
                         // If retryable exception happens when run entity task, will push back entity to the end
                         // of pending entities queue, then we can retry it later.
-                        adTaskCacheManager.pushBackEntity(adTask.getTaskId(), adTask.getDetectorId(), entityValue);
+                        adTaskCacheManager.pushBackEntity(adTask.getTaskId(), adTask.getConfigId(), entityValue);
                     } else {
                         // If exception is not retryable or exceeds retry limit, will remove this entity.
-                        adTaskCacheManager.removeEntity(adTask.getDetectorId(), entityValue);
+                        adTaskCacheManager.removeEntity(adTask.getConfigId(), entityValue);
                         logger.warn("Entity task failed, task id: {}, entity: {}", adTask.getTaskId(), adTask.getEntity().toString());
                     }
                     if (!adTaskCacheManager.hasEntity(detectorId)) {
                         adTaskCacheManager.setDetectorTaskSlots(detectorId, 0);
-                        adTaskManager.setHCDetectorTaskDone(adTask, ADTaskState.FINISHED, listener);
+                        adTaskManager.setHCDetectorTaskDone(adTask, TaskState.FINISHED, listener);
                     } else {
                         logger.debug("scale task slots for PUSH_BACK_ENTITY, detector {} task {}", detectorId, adTask.getTaskId());
                         int taskSlots = adTaskCacheManager.scaleDownHCDetectorTaskSlots(detectorId, 1);
@@ -176,7 +177,7 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                             logger.debug("After scale down, only 1 task slot reserved for detector {}, run next entity", detectorId);
                             adTaskManager.runNextEntityForHCADHistorical(adTask, transportService, listener);
                         }
-                        listener.onResponse(new AnomalyDetectorJobResponse(adTask.getTaskId(), 0, 0, 0, RestStatus.ACCEPTED));
+                        listener.onResponse(new JobResponse(adTask.getTaskId()));
                     }
                 } else {
                     logger.warn("Can only push back entity task");
@@ -193,20 +194,20 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                         adTaskCacheManager.scaleUpDetectorTaskSlots(detectorId, newSlots);
                     }
                 }
-                listener.onResponse(new AnomalyDetectorJobResponse(detector.getDetectorId(), 0, 0, 0, RestStatus.OK));
+                listener.onResponse(new JobResponse(detector.getId()));
                 break;
             case CANCEL:
                 logger.debug("Received CANCEL action for detector {}", detectorId);
                 // Cancel HC detector's historical analysis.
                 // Don't support single detector for this action as single entity task will be cancelled directly
                 // on worker node.
-                if (detector.isMultientityDetector()) {
+                if (detector.isHighCardinality()) {
                     adTaskCacheManager.clearPendingEntities(detectorId);
                     adTaskCacheManager.removeRunningEntity(detectorId, entityValue);
                     if (!adTaskCacheManager.hasEntity(detectorId) || !adTask.isEntityTask()) {
-                        adTaskManager.setHCDetectorTaskDone(adTask, ADTaskState.STOPPED, listener);
+                        adTaskManager.setHCDetectorTaskDone(adTask, TaskState.STOPPED, listener);
                     }
-                    listener.onResponse(new AnomalyDetectorJobResponse(adTask.getTaskId(), 0, 0, 0, RestStatus.OK));
+                    listener.onResponse(new JobResponse(adTask.getTaskId()));
                 } else {
                     listener.onFailure(new IllegalArgumentException("Only support cancel HC now"));
                 }
@@ -227,7 +228,7 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                 for (String entity : staleRunningEntities) {
                     adTaskManager.removeStaleRunningEntity(adTask, entity, transportService, listener);
                 }
-                listener.onResponse(new AnomalyDetectorJobResponse(adTask.getTaskId(), 0, 0, 0, RestStatus.OK));
+                listener.onResponse(new JobResponse(adTask.getTaskId()));
                 break;
             case CLEAN_CACHE:
                 boolean historicalTask = adTask.isHistoricalTask();
@@ -249,7 +250,7 @@ protected void doExecute(Task task, ForwardADTaskRequest request, ActionListener
                     stateManager.clear(detectorId);
                     featureManager.clear(detectorId);
                 }
-                listener.onResponse(new AnomalyDetectorJobResponse(detector.getDetectorId(), 0, 0, 0, RestStatus.OK));
+                listener.onResponse(new JobResponse(detector.getId()));
                 break;
             default:
                 listener.onFailure(new OpenSearchStatusException("Unsupported AD task action " + adTaskAction, RestStatus.BAD_REQUEST));
diff --git a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorRequest.java b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorRequest.java
index 3289ac541..83358bc9d 100644
--- a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorRequest.java
@@ -17,11 +17,11 @@
 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.model.Entity;
 import org.opensearch.core.common.io.stream.InputStreamStreamInput;
 import org.opensearch.core.common.io.stream.OutputStreamStreamOutput;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
+import org.opensearch.timeseries.model.Entity;

 public class GetAnomalyDetectorRequest extends ActionRequest {
diff --git a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorResponse.java b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorResponse.java
index 84ad4659e..5db241377 100644
--- a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorResponse.java
@@ -18,10 +18,8 @@
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
 import org.opensearch.ad.model.DetectorProfile;
 import org.opensearch.ad.model.EntityProfile;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.InputStreamStreamInput;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
@@ -32,6 +30,8 @@
 import org.opensearch.core.rest.RestStatus;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.model.Job;
+import org.opensearch.timeseries.util.RestHandlerUtils;

 public class GetAnomalyDetectorResponse extends ActionResponse implements ToXContentObject {
     public static final String DETECTOR_PROFILE = "detectorProfile";
@@ -41,7 +41,7 @@ public class GetAnomalyDetectorResponse extends ActionResponse implements ToXCon
     private long primaryTerm;
     private long seqNo;
     private AnomalyDetector detector;
-    private AnomalyDetectorJob adJob;
+    private Job adJob;
     private ADTask realtimeAdTask;
     private ADTask historicalAdTask;
     private RestStatus restStatus;
@@ -72,7 +72,7 @@ public GetAnomalyDetectorResponse(StreamInput in) throws IOException {
         detector = new AnomalyDetector(in);
         returnJob = in.readBoolean();
         if (returnJob) {
-            adJob = new AnomalyDetectorJob(in);
+            adJob = new Job(in);
         } else {
             adJob = null;
         }
@@ -96,7 +96,7 @@ public GetAnomalyDetectorResponse(
         long primaryTerm,
         long seqNo,
         AnomalyDetector detector,
-        AnomalyDetectorJob adJob,
+        Job adJob,
         boolean returnJob,
         ADTask realtimeAdTask,
         ADTask historicalAdTask,
@@ -204,7 +204,7 @@ public DetectorProfile getDetectorProfile() {
         return detectorProfile;
     }

-    public AnomalyDetectorJob getAdJob() {
+    public Job getAdJob() {
         return adJob;
     }
diff --git a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportAction.java
index d616fa204..bdc4460f2 100644
--- a/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportAction.java
@@ -11,17 +11,15 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG;
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_GET_DETECTOR;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_GET_DETECTOR;
 import static org.opensearch.ad.model.ADTaskType.ALL_DETECTOR_TASK_TYPES;
-import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX;
-import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES;
-import static org.opensearch.ad.util.ParseUtils.getUserContext;
-import static org.opensearch.ad.util.ParseUtils.resolveUserAndExecute;
-import static org.opensearch.ad.util.RestHandlerUtils.PROFILE;
-import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG;
+import static org.opensearch.timeseries.util.ParseUtils.getUserContext;
+import static org.opensearch.timeseries.util.ParseUtils.resolveUserAndExecute;
+import static org.opensearch.timeseries.util.RestHandlerUtils.PROFILE;
+import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener;

 import java.util.ArrayList;
 import java.util.Arrays;
@@ -45,20 +43,14 @@
 import org.opensearch.action.support.HandledTransportAction;
 import org.opensearch.ad.AnomalyDetectorProfileRunner;
 import org.opensearch.ad.EntityProfileRunner;
-import org.opensearch.ad.Name;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.ADTaskType;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
 import org.opensearch.ad.model.DetectorProfile;
 import org.opensearch.ad.model.DetectorProfileName;
-import org.opensearch.ad.model.Entity;
 import org.opensearch.ad.model.EntityProfileName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
-import org.opensearch.ad.util.RestHandlerUtils;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.CheckedConsumer;
@@ -72,6 +64,14 @@
 import org.opensearch.core.xcontent.NamedXContentRegistry;
 import org.opensearch.core.xcontent.XContentParser;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.Name;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.Job;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
+import org.opensearch.timeseries.util.RestHandlerUtils;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;

 import com.google.common.collect.Sets;

@@ -125,8 +125,8 @@ public GetAnomalyDetectorTransportAction(
         this.xContentRegistry = xContentRegistry;
         this.nodeFilter = nodeFilter;
-        filterByEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
+        filterByEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
         this.transportService = transportService;
         this.adTaskManager = adTaskManager;
     }
@@ -147,7 +147,8 @@ protected void doExecute(Task task, ActionRequest actionRequest, ActionListener<
                 (anomalyDetector) -> getExecute(request, listener),
                 client,
                 clusterService,
-                xContentRegistry
+                xContentRegistry,
+                AnomalyDetector.class
             );
         } catch (Exception e) {
             LOG.error(e);
@@ -172,7 +173,7 @@ protected void getExecute(GetAnomalyDetectorRequest request, ActionListener { listener
@@ -202,7 +203,7 @@ protected void getExecute(GetAnomalyDetectorRequest request, ActionListener
         historicalAdTask,
         ActionListener listener
     ) {
-        MultiGetRequest.Item adItem = new MultiGetRequest.Item(ANOMALY_DETECTORS_INDEX, detectorID);
+        MultiGetRequest.Item adItem = new MultiGetRequest.Item(CommonName.CONFIG_INDEX, detectorID);
         MultiGetRequest multiGetRequest = new MultiGetRequest().add(adItem);
         if (returnJob) {
-            MultiGetRequest.Item adJobItem = new MultiGetRequest.Item(ANOMALY_DETECTOR_JOB_INDEX, detectorID);
+            MultiGetRequest.Item adJobItem = new MultiGetRequest.Item(CommonName.JOB_INDEX, detectorID);
             multiGetRequest.add(adJobItem);
         }
         client.multiGet(multiGetRequest, onMultiGetResponse(listener, returnJob, returnTask, realtimeAdTask, historicalAdTask, detectorID));
@@ -290,16 +291,16 @@ private ActionListener onMultiGetResponse(
             public void onResponse(MultiGetResponse multiGetResponse) {
                 MultiGetItemResponse[] responses = multiGetResponse.getResponses();
                 AnomalyDetector detector = null;
-                AnomalyDetectorJob adJob = null;
+                Job adJob = null;
                 String id = null;
                 long version = 0;
                 long seqNo = 0;
                 long primaryTerm = 0;
                 for (MultiGetItemResponse response : responses) {
-                    if (ANOMALY_DETECTORS_INDEX.equals(response.getIndex())) {
+                    if (CommonName.CONFIG_INDEX.equals(response.getIndex())) {
                         if (response.getResponse() == null || !response.getResponse().isExists()) {
-                            listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_DETECTOR_MSG + detectorId, RestStatus.NOT_FOUND));
+                            listener.onFailure(new OpenSearchStatusException(FAIL_TO_FIND_CONFIG_MSG + detectorId, RestStatus.NOT_FOUND));
                             return;
                         }
                         id = response.getId();
@@ -321,7 +322,7 @@ public void onResponse(MultiGetResponse multiGetResponse) {
                         }
                     }

-                    if (ANOMALY_DETECTOR_JOB_INDEX.equals(response.getIndex())) {
+                    if (CommonName.JOB_INDEX.equals(response.getIndex())) {
                         if (response.getResponse() != null
                             && response.getResponse().isExists()
                             && !response.getResponse().isSourceEmpty()) {
@@ -330,7 +331,7 @@ public void onResponse(MultiGetResponse multiGetResponse) {
                                     .createXContentParserFromRegistry(xContentRegistry, response.getResponse().getSourceAsBytesRef())
                             ) {
                                 ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
-                                adJob = AnomalyDetectorJob.parse(parser);
+                                adJob = Job.parse(parser);
                             } catch (Exception e) {
                                 String message = "Failed to parse detector job " + detectorId;
                                 listener.onFailure(buildInternalServerErrorResponse(e, message));
diff --git a/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorResponse.java b/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorResponse.java
index e6adbc0e2..661f16285 100644
--- a/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorResponse.java
@@ -14,13 +14,13 @@
 import java.io.IOException;

 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.rest.RestStatus;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.util.RestHandlerUtils;

 public class IndexAnomalyDetectorResponse extends ActionResponse implements ToXContentObject {
     private final String id;
diff --git a/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportAction.java
index 55f969f28..ac0e560b1 100644
--- a/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportAction.java
@@ -11,13 +11,13 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_CREATE_DETECTOR;
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_UPDATE_DETECTOR;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES;
-import static org.opensearch.ad.util.ParseUtils.checkFilterByBackendRoles;
-import static org.opensearch.ad.util.ParseUtils.getDetector;
-import static org.opensearch.ad.util.ParseUtils.getUserContext;
-import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_CREATE_DETECTOR;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_UPDATE_DETECTOR;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES;
+import static org.opensearch.timeseries.util.ParseUtils.checkFilterByBackendRoles;
+import static org.opensearch.timeseries.util.ParseUtils.getConfig;
+import static org.opensearch.timeseries.util.ParseUtils.getUserContext;
+import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener;

 import java.util.List;
 import java.util.function.Consumer;
@@ -28,14 +28,11 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
 import org.opensearch.action.support.WriteRequest;
-import org.opensearch.ad.feature.SearchFeatureDao;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.rest.handler.AnomalyDetectorFunction;
 import org.opensearch.ad.rest.handler.IndexAnomalyDetectorActionHandler;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.inject.Inject;
@@ -49,6 +46,9 @@
 import org.opensearch.rest.RestRequest;
 import org.opensearch.search.builder.SearchSourceBuilder;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.function.ExecutorFunction;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;

 public class IndexAnomalyDetectorTransportAction extends HandledTransportAction {
@@ -56,7 +56,7 @@ public class IndexAnomalyDetectorTransportAction extends HandledTransportAction<
     private final Client client;
     private final SecurityClientUtil clientUtil;
     private final TransportService transportService;
-    private final AnomalyDetectionIndices anomalyDetectionIndices;
+    private final ADIndexManagement anomalyDetectionIndices;
     private final ClusterService clusterService;
     private final NamedXContentRegistry xContentRegistry;
     private final ADTaskManager adTaskManager;
@@ -72,7 +72,7 @@ public IndexAnomalyDetectorTransportAction(
         SecurityClientUtil clientUtil,
         ClusterService clusterService,
         Settings settings,
-        AnomalyDetectionIndices anomalyDetectionIndices,
+        ADIndexManagement anomalyDetectionIndices,
         NamedXContentRegistry xContentRegistry,
         ADTaskManager adTaskManager,
         SearchFeatureDao searchFeatureDao
@@ -86,8 +86,8 @@ public IndexAnomalyDetectorTransportAction(
         this.xContentRegistry = xContentRegistry;
         this.adTaskManager = adTaskManager;
         this.searchFeatureDao = searchFeatureDao;
-        filterByEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
+        filterByEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
         this.settings = settings;
     }
@@ -126,7 +126,17 @@ private void resolveUserAndExecute(
         boolean filterByBackendRole = requestedUser == null ? false : filterByEnabled;
             // Update detector request, check if user has permissions to update the detector
             // Get detector and verify backend roles
-            getDetector(requestedUser, detectorId, listener, function, client, clusterService, xContentRegistry, filterByBackendRole);
+            getConfig(
+                requestedUser,
+                detectorId,
+                listener,
+                function,
+                client,
+                clusterService,
+                xContentRegistry,
+                filterByBackendRole,
+                AnomalyDetector.class
+            );
         } else {
             // Create Detector. No need to get current detector.
             function.accept(null);
@@ -189,7 +199,7 @@ protected void adExecute(
     private void checkIndicesAndExecute(
         List indices,
-        AnomalyDetectorFunction function,
+        ExecutorFunction function,
         ActionListener listener
     ) {
         SearchRequest searchRequest = new SearchRequest()
diff --git a/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorRequest.java b/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorRequest.java
index 5e75c413b..11fa848f7 100644
--- a/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorRequest.java
@@ -48,7 +48,7 @@ public AnomalyDetector getDetector() {
         return detector;
     }

-    public String getDetectorId() {
+    public String getId() {
         return detectorId;
     }
diff --git a/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportAction.java
index 2cce23643..5f6c6c9d3 100644
--- a/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportAction.java
@@ -11,14 +11,14 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_PREVIEW_DETECTOR;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES;
+import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_PREVIEW_DETECTOR;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_ANOMALY_FEATURES;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_CONCURRENT_PREVIEW;
-import static org.opensearch.ad.util.ParseUtils.getUserContext;
-import static org.opensearch.ad.util.ParseUtils.resolveUserAndExecute;
-import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.opensearch.timeseries.util.ParseUtils.getUserContext;
+import static org.opensearch.timeseries.util.ParseUtils.resolveUserAndExecute;
+import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener;

 import java.io.IOException;
 import java.time.Instant;
@@ -34,15 +34,10 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
 import org.opensearch.ad.AnomalyDetectorRunner;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.ClientException;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.CheckedConsumer;
@@ -55,6 +50,13 @@
 import org.opensearch.core.xcontent.NamedXContentRegistry;
 import org.opensearch.core.xcontent.XContentParser;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.ClientException;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 import org.opensearch.transport.TransportService;

 public class PreviewAnomalyDetectorTransportAction extends
@@ -66,7 +68,7 @@ public class PreviewAnomalyDetectorTransportAction extends
     private final NamedXContentRegistry xContentRegistry;
     private volatile Integer maxAnomalyFeatures;
     private volatile Boolean filterByEnabled;
-    private final ADCircuitBreakerService adCircuitBreakerService;
+    private final CircuitBreakerService adCircuitBreakerService;
     private Semaphore lock;

     @Inject
@@ -78,7 +80,7 @@ public PreviewAnomalyDetectorTransportAction(
         Client client,
         AnomalyDetectorRunner anomalyDetectorRunner,
         NamedXContentRegistry xContentRegistry,
-        ADCircuitBreakerService adCircuitBreakerService
+        CircuitBreakerService adCircuitBreakerService
     ) {
         super(PreviewAnomalyDetectorAction.NAME, transportService, actionFilters, PreviewAnomalyDetectorRequest::new);
         this.clusterService = clusterService;
@@ -87,8 +89,8 @@ public PreviewAnomalyDetectorTransportAction(
         this.xContentRegistry = xContentRegistry;
         maxAnomalyFeatures = MAX_ANOMALY_FEATURES.get(settings);
         clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_ANOMALY_FEATURES, it -> maxAnomalyFeatures = it);
-        filterByEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
+        filterByEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
         this.adCircuitBreakerService = adCircuitBreakerService;
         this.lock = new Semaphore(MAX_CONCURRENT_PREVIEW.get(settings), true);
         clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_CONCURRENT_PREVIEW, it -> { lock = new Semaphore(it); });
@@ -100,7 +102,7 @@ protected void doExecute(
         PreviewAnomalyDetectorRequest request,
         ActionListener actionListener
     ) {
-        String detectorId = request.getDetectorId();
+        String detectorId = request.getId();
         User user = getUserContext(client);
         ActionListener listener = wrapRestActionListener(actionListener, FAIL_TO_PREVIEW_DETECTOR);
         try (ThreadContext.StoredContext context = client.threadPool().getThreadContext().stashContext()) {
@@ -112,7 +114,8 @@ protected void doExecute(
                 (anomalyDetector) -> previewExecute(request, context, listener),
                 client,
                 clusterService,
-                xContentRegistry
+                xContentRegistry,
+                AnomalyDetector.class
             );
         } catch (Exception e) {
             logger.error(e);
@@ -126,19 +129,18 @@ void previewExecute(
         ActionListener listener
     ) {
         if (adCircuitBreakerService.isOpen()) {
-            listener
-                .onFailure(new LimitExceededException(request.getDetectorId(), CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false));
+            listener.onFailure(new LimitExceededException(request.getId(), CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG, false));
             return;
         }
         try {
             if (!lock.tryAcquire()) {
-                listener.onFailure(new ClientException(request.getDetectorId(), CommonErrorMessages.REQUEST_THROTTLED_MSG));
+                listener.onFailure(new ClientException(request.getId(), ADCommonMessages.REQUEST_THROTTLED_MSG));
                 return;
             }

             try {
                 AnomalyDetector detector = request.getDetector();
-                String detectorId = request.getDetectorId();
+                String detectorId = request.getId();
                 Instant startTime = request.getStartTime();
                 Instant endTime = request.getEndTime();
                 ActionListener releaseListener = ActionListener.runAfter(listener, () -> lock.release());
@@ -174,7 +176,7 @@ private String validateDetector(AnomalyDetector detector) {
         if (detector.getFeatureAttributes().isEmpty()) {
             return "Can't preview detector without feature";
         } else {
-            return RestHandlerUtils.checkAnomalyDetectorFeaturesSyntax(detector, maxAnomalyFeatures);
+            return RestHandlerUtils.checkFeaturesSyntax(detector, maxAnomalyFeatures);
         }
     }
@@ -189,11 +191,11 @@ public void accept(List anomalyResult) throws Exception {
                 listener.onResponse(response);
             }
         }, exception -> {
-            logger.error("Unexpected error running anomaly detector " + detector.getDetectorId(), exception);
+            logger.error("Unexpected error running anomaly detector " + detector.getId(), exception);
             listener
                 .onFailure(
                     new OpenSearchStatusException(
-                        "Unexpected error running anomaly detector " + detector.getDetectorId() + ". " + exception.getMessage(),
+                        "Unexpected error running anomaly detector " + detector.getId() + ". " + exception.getMessage(),
                         RestStatus.INTERNAL_SERVER_ERROR
                     )
                 );
@@ -209,7 +211,7 @@ private void previewAnomalyDetector(
         ThreadContext.StoredContext context
     ) throws IOException {
         if (!StringUtils.isBlank(detectorId)) {
-            GetRequest getRequest = new GetRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX).id(detectorId);
+            GetRequest getRequest = new GetRequest(CommonName.CONFIG_INDEX).id(detectorId);
             client.get(getRequest, onGetAnomalyDetectorResponse(listener, startTime, endTime, context));
         } else {
             anomalyDetectorRunner
@@ -246,6 +248,6 @@ public void accept(GetResponse response) throws Exception {
                     listener.onFailure(e);
                 }
             }
-        }, exception -> { listener.onFailure(new AnomalyDetectionException("Could not execute get query to find detector")); });
+        }, exception -> { listener.onFailure(new TimeSeriesException("Could not execute get query to find detector")); });
     }
 }
diff --git a/src/main/java/org/opensearch/ad/transport/ProfileNodeRequest.java b/src/main/java/org/opensearch/ad/transport/ProfileNodeRequest.java
index d2c2b19bd..1aeaba8f3 100644
--- a/src/main/java/org/opensearch/ad/transport/ProfileNodeRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ProfileNodeRequest.java
@@ -39,8 +39,8 @@ public ProfileNodeRequest(ProfileRequest request) {
         this.request = request;
     }

-    public String getDetectorId() {
-        return request.getDetectorId();
+    public String getId() {
+        return request.getId();
     }

     /**
diff --git a/src/main/java/org/opensearch/ad/transport/ProfileNodeResponse.java b/src/main/java/org/opensearch/ad/transport/ProfileNodeResponse.java
index 2127c071c..9517f6add 100644
--- a/src/main/java/org/opensearch/ad/transport/ProfileNodeResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/ProfileNodeResponse.java
@@ -16,14 +16,14 @@
 import java.util.Map;

 import org.opensearch.action.support.nodes.BaseNodeResponse;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.ModelProfile;
-import org.opensearch.ad.util.Bwc;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.ToXContentFragment;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.constant.CommonName;

 /**
  * Profile response on a node
  */
@@ -51,7 +51,7 @@ public ProfileNodeResponse(StreamInput in) throws IOException {
         shingleSize = in.readInt();
         activeEntities = in.readVLong();
         totalUpdates = in.readVLong();
-        if (Bwc.supportMultiCategoryFields(in.getVersion()) && in.readBoolean()) {
+        if (in.readBoolean()) {
             // added after OpenSearch 1.0
             modelProfiles = in.readList(ModelProfile::new);
             modelCount = in.readVLong();
@@ -111,15 +111,13 @@ public void writeTo(StreamOutput out) throws IOException {
         out.writeInt(shingleSize);
         out.writeVLong(activeEntities);
         out.writeVLong(totalUpdates);
-        if (Bwc.supportMultiCategoryFields(out.getVersion())) {
-            // added after OpenSearch 1.0
-            if (modelProfiles != null) {
-                out.writeBoolean(true);
-                out.writeList(modelProfiles);
-                out.writeVLong(modelCount);
-            } else {
-                out.writeBoolean(false);
-            }
+        // added after OpenSearch 1.0
+        if (modelProfiles != null) {
+            out.writeBoolean(true);
+            out.writeList(modelProfiles);
+            out.writeVLong(modelCount);
+        } else {
+            out.writeBoolean(false);
         }
     }
@@ -139,12 +137,12 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws
         }
         builder.endObject();

-        builder.field(CommonName.SHINGLE_SIZE, shingleSize);
-        builder.field(CommonName.ACTIVE_ENTITIES, activeEntities);
-        builder.field(CommonName.TOTAL_UPDATES, totalUpdates);
+        builder.field(ADCommonName.SHINGLE_SIZE, shingleSize);
+        builder.field(ADCommonName.ACTIVE_ENTITIES, activeEntities);
+        builder.field(ADCommonName.TOTAL_UPDATES, totalUpdates);

-        builder.field(CommonName.MODEL_COUNT, modelCount);
-        builder.startArray(CommonName.MODELS);
+        builder.field(ADCommonName.MODEL_COUNT, modelCount);
+        builder.startArray(ADCommonName.MODELS);
         for (ModelProfile modelProfile : modelProfiles) {
             builder.startObject();
             modelProfile.toXContent(builder, params);
diff --git a/src/main/java/org/opensearch/ad/transport/ProfileRequest.java b/src/main/java/org/opensearch/ad/transport/ProfileRequest.java
index 289e8932f..ea779e733 100644
--- a/src/main/java/org/opensearch/ad/transport/ProfileRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/ProfileRequest.java
@@ -74,7 +74,7 @@ public void writeTo(StreamOutput out) throws IOException {
         out.writeBoolean(forMultiEntityDetector);
     }

-    public String getDetectorId() {
+    public String getId() {
         return detectorId;
     }
diff --git a/src/main/java/org/opensearch/ad/transport/ProfileResponse.java b/src/main/java/org/opensearch/ad/transport/ProfileResponse.java
index d0277993d..11ba28163 100644
--- a/src/main/java/org/opensearch/ad/transport/ProfileResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/ProfileResponse.java
@@ -20,10 +20,9 @@
 import org.apache.logging.log4j.Logger;
 import org.opensearch.action.FailedNodeException;
 import org.opensearch.action.support.nodes.BaseNodesResponse;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.ModelProfile;
 import org.opensearch.ad.model.ModelProfileOnNode;
-import org.opensearch.ad.util.Bwc;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
@@ -36,13 +35,13 @@ public class ProfileResponse extends BaseNodesResponse implements ToXContentFragment {
     private static final Logger LOG = LogManager.getLogger(ProfileResponse.class);
     // filed name in toXContent
-    static final String COORDINATING_NODE = CommonName.COORDINATING_NODE;
-    static final String SHINGLE_SIZE = CommonName.SHINGLE_SIZE;
-    static final String TOTAL_SIZE = CommonName.TOTAL_SIZE_IN_BYTES;
-    static final String ACTIVE_ENTITY = CommonName.ACTIVE_ENTITIES;
-    static final String MODELS = CommonName.MODELS;
-    static final String TOTAL_UPDATES = CommonName.TOTAL_UPDATES;
-    static final String MODEL_COUNT = CommonName.MODEL_COUNT;
+    static final String COORDINATING_NODE = ADCommonName.COORDINATING_NODE;
+    static final String SHINGLE_SIZE = ADCommonName.SHINGLE_SIZE;
+    static final String TOTAL_SIZE = ADCommonName.TOTAL_SIZE_IN_BYTES;
+    static final String ACTIVE_ENTITY = ADCommonName.ACTIVE_ENTITIES;
+    static final String MODELS = ADCommonName.MODELS;
+    static final String TOTAL_UPDATES = ADCommonName.TOTAL_UPDATES;
+    static final String MODEL_COUNT = ADCommonName.MODEL_COUNT;
     // changed from ModelProfile to ModelProfileOnNode since Opensearch 1.1
     private ModelProfileOnNode[] modelProfile;
@@ -65,13 +64,7 @@ public ProfileResponse(StreamInput in) throws IOException {
         int size = in.readVInt();
         modelProfile = new ModelProfileOnNode[size];
         for (int i = 0; i < size; i++) {
-            if (Bwc.supportMultiCategoryFields(in.getVersion())) {
-                modelProfile[i] = new ModelProfileOnNode(in);
-            } else {
-                // we don't have model information from old node
-                ModelProfile profile = new ModelProfile(in);
-                modelProfile[i] = new ModelProfileOnNode(CommonName.EMPTY_FIELD, profile);
-            }
+            modelProfile[i] = new ModelProfileOnNode(in);
         }

         shingleSize = in.readInt();
@@ -79,9 +72,7 @@ public ProfileResponse(StreamInput in) throws IOException {
         totalSizeInBytes = in.readVLong();
         activeEntities = in.readVLong();
         totalUpdates = in.readVLong();
-        if (Bwc.supportMultiCategoryFields(in.getVersion())) {
-            modelCount = in.readVLong();
-        }
+        modelCount = in.readVLong();
     }

     /**
@@ -140,15 +131,8 @@ public void writeTo(StreamOutput out) throws IOException {
         super.writeTo(out);
         out.writeVInt(modelProfile.length);

-        if (Bwc.supportMultiCategoryFields(out.getVersion())) {
-            for (ModelProfileOnNode profile : modelProfile) {
-                profile.writeTo(out);
-            }
-        } else {
-            for (ModelProfileOnNode profile : modelProfile) {
-                ModelProfile oldFormatModelProfile = profile.getModelProfile();
-                oldFormatModelProfile.writeTo(out);
-            }
+        for (ModelProfileOnNode profile : modelProfile) {
+            profile.writeTo(out);
         }

         out.writeInt(shingleSize);
@@ -156,9 +140,7 @@ public void writeTo(StreamOutput out) throws IOException {
         out.writeVLong(totalSizeInBytes);
         out.writeVLong(activeEntities);
         out.writeVLong(totalUpdates);
-        if (Bwc.supportMultiCategoryFields(out.getVersion())) {
-            out.writeVLong(modelCount);
-        }
+        out.writeVLong(modelCount);
     }

     @Override
diff --git a/src/main/java/org/opensearch/ad/transport/ProfileTransportAction.java b/src/main/java/org/opensearch/ad/transport/ProfileTransportAction.java
index ec1d16ac3..af1bbed50 100644
--- a/src/main/java/org/opensearch/ad/transport/ProfileTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/ProfileTransportAction.java
@@ -11,7 +11,7 @@ package org.opensearch.ad.transport;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE;

 import java.io.IOException;
 import java.util.List;
@@ -83,8 +83,8 @@ public ProfileTransportAction(
         this.modelManager = modelManager;
         this.featureManager = featureManager;
         this.cacheProvider = cacheProvider;
-        this.numModelsToReturn = MAX_MODEL_SIZE_PER_NODE.get(settings);
-        clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_MODEL_SIZE_PER_NODE, it -> this.numModelsToReturn = it);
+        this.numModelsToReturn = AD_MAX_MODEL_SIZE_PER_NODE.get(settings);
+        clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_MAX_MODEL_SIZE_PER_NODE, it -> this.numModelsToReturn = it);
     }

     @Override
@@ -104,7 +104,7 @@ protected ProfileNodeResponse newNodeResponse(StreamInput in) throws IOException

     @Override
     protected ProfileNodeResponse nodeOperation(ProfileNodeRequest request) {
-        String detectorId = request.getDetectorId();
+        String detectorId = request.getId();
         Set profiles = request.getProfilesToBeRetrieved();
         int shingleSize = -1;
         long activeEntity = 0;
diff --git a/src/main/java/org/opensearch/ad/transport/RCFPollingRequest.java b/src/main/java/org/opensearch/ad/transport/RCFPollingRequest.java
index 9db19ff76..fdf2055cf 100644
--- a/src/main/java/org/opensearch/ad/transport/RCFPollingRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/RCFPollingRequest.java
@@ -17,8 +17,8 @@
 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
@@ -52,7 +52,7 @@ public void writeTo(StreamOutput out) throws IOException {
     public ActionRequestValidationException validate() {
         ActionRequestValidationException validationException = null;
         if (Strings.isEmpty(adID)) {
-            validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException);
+            validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException);
         }
         return validationException;
     }
@@ -60,7 +60,7 @@ public ActionRequestValidationException validate() {
     @Override
     public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
         builder.startObject();
-        builder.field(CommonName.ID_JSON_KEY, adID);
+        builder.field(ADCommonName.ID_JSON_KEY, adID);
         builder.endObject();
         return builder;
     }
diff --git a/src/main/java/org/opensearch/ad/transport/RCFPollingTransportAction.java b/src/main/java/org/opensearch/ad/transport/RCFPollingTransportAction.java
index 0e7ea0dcf..a8bd64603 100644
--- a/src/main/java/org/opensearch/ad/transport/RCFPollingTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/RCFPollingTransportAction.java
@@ -20,9 +20,7 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
 import org.opensearch.ad.cluster.HashRing;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.ad.ml.ModelManager;
-import org.opensearch.ad.ml.SingleStreamModelIdMapper;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.service.ClusterService;
@@ -32,6 +30,8 @@
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.tasks.Task;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
 import org.opensearch.transport.TransportException;
 import org.opensearch.transport.TransportRequestOptions;
 import org.opensearch.transport.TransportResponseHandler;
@@ -69,7 +69,7 @@ public RCFPollingTransportAction(
         this.option = TransportRequestOptions
             .builder()
             .withType(TransportRequestOptions.Type.REG)
-            .withTimeout(AnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings))
+            .withTimeout(AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(settings))
             .build();
         this.clusterService = clusterService;
     }
@@ -83,7 +83,7 @@ protected void doExecute(Task task, RCFPollingRequest request, ActionListener
         rcfNode = hashRing.getOwningNodeWithSameLocalAdVersionForRealtimeAD(rcfModelID);
         if (!rcfNode.isPresent()) {
-            listener.onFailure(new AnomalyDetectionException(adID, NO_NODE_FOUND_MSG));
+            listener.onFailure(new TimeSeriesException(adID, NO_NODE_FOUND_MSG));
             return;
         }
@@ -99,7 +99,7 @@ protected void doExecute(Task task, RCFPollingRequest request, ActionListener
                     listener.onResponse(new RCFPollingResponse(totalUpdates)),
-                    e -> listener.onFailure(new AnomalyDetectionException(adID, FAIL_TO_GET_RCF_UPDATE_MSG, e))
+                    e -> listener.onFailure(new TimeSeriesException(adID, FAIL_TO_GET_RCF_UPDATE_MSG, e))
                 )
             );
         } else if (request.remoteAddress() == null) {
@@ -136,12 +136,12 @@ public String executor() {
                 });
             } catch (Exception e) {
                 LOG.error(String.format(Locale.ROOT, "Fail to poll RCF models for {}", adID), e);
-                listener.onFailure(new AnomalyDetectionException(adID, FAIL_TO_GET_RCF_UPDATE_MSG, e));
+                listener.onFailure(new TimeSeriesException(adID, FAIL_TO_GET_RCF_UPDATE_MSG, e));
             }
         } else {
             LOG.error("Fail to poll rcf for model {} due to an unexpected bug.", rcfModelID);
-            listener.onFailure(new AnomalyDetectionException(adID, NO_NODE_FOUND_MSG));
+            listener.onFailure(new TimeSeriesException(adID, NO_NODE_FOUND_MSG));
         }
     }
 }
diff --git a/src/main/java/org/opensearch/ad/transport/RCFResultRequest.java b/src/main/java/org/opensearch/ad/transport/RCFResultRequest.java
index df5d60755..b617704b8 100644
--- a/src/main/java/org/opensearch/ad/transport/RCFResultRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/RCFResultRequest.java
@@ -17,13 +17,14 @@
 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.constant.CommonName;

 public class RCFResultRequest extends ActionRequest implements ToXContentObject {
     private String adID;
@@ -81,10 +82,10 @@ public ActionRequestValidationException validate() {
             validationException = addValidationError(RCFResultRequest.INVALID_FEATURE_MSG, validationException);
         }
         if (Strings.isEmpty(adID)) {
-            validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException);
+            validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException);
         }
         if (Strings.isEmpty(modelID)) {
-            validationException = addValidationError(CommonErrorMessages.MODEL_ID_MISSING_MSG, validationException);
+            validationException = addValidationError(ADCommonMessages.MODEL_ID_MISSING_MSG, validationException);
         }
         return validationException;
     }
@@ -92,9 +93,9 @@ public ActionRequestValidationException validate() {
     @Override
     public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
         builder.startObject();
-        builder.field(CommonName.ID_JSON_KEY, adID);
-        builder.field(CommonName.MODEL_ID_KEY, modelID);
-        builder.startArray(CommonName.FEATURE_JSON_KEY);
+        builder.field(ADCommonName.ID_JSON_KEY, adID);
+        builder.field(CommonName.MODEL_ID_FIELD, modelID);
+        builder.startArray(ADCommonName.FEATURE_JSON_KEY);
         for (double feature : features) {
             builder.value(feature);
         }
diff --git a/src/main/java/org/opensearch/ad/transport/RCFResultResponse.java b/src/main/java/org/opensearch/ad/transport/RCFResultResponse.java
index 76332d407..1a9c7fb6b 100644
--- a/src/main/java/org/opensearch/ad/transport/RCFResultResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/RCFResultResponse.java
@@ -14,8 +14,7 @@
 import java.io.IOException;

 import org.opensearch.Version;
-import org.opensearch.ad.cluster.ADVersionUtil;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
@@ -168,7 +167,7 @@ public void writeTo(StreamOutput out) throws IOException {
         out.writeDouble(confidence);
         out.writeVInt(forestSize);
         out.writeDoubleArray(attribution);
-        if (ADVersionUtil.compatibleWithVersionOnOrAfter1_1(remoteAdVersion)) {
+        if (remoteAdVersion != null) {
             out.writeLong(totalUpdates);
             out.writeDouble(anomalyGrade);
             out.writeOptionalInt(relativeIndex);
@@ -210,7 +209,7 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws
         builder.field(FOREST_SIZE_JSON_KEY, forestSize);
         builder.field(ATTRIBUTION_JSON_KEY, attribution);
         builder.field(TOTAL_UPDATES_JSON_KEY, totalUpdates);
-        builder.field(CommonName.ANOMALY_GRADE_JSON_KEY, anomalyGrade);
+        builder.field(ADCommonName.ANOMALY_GRADE_JSON_KEY, anomalyGrade);
         builder.field(RELATIVE_INDEX_FIELD_JSON_KEY, relativeIndex);
         builder.field(PAST_VALUES_FIELD_JSON_KEY, pastValues);
         builder.field(EXPECTED_VAL_LIST_FIELD_JSON_KEY, expectedValuesList);
diff --git a/src/main/java/org/opensearch/ad/transport/RCFResultTransportAction.java b/src/main/java/org/opensearch/ad/transport/RCFResultTransportAction.java
index 456ac0bf8..d7df181bb 100644
--- a/src/main/java/org/opensearch/ad/transport/RCFResultTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/RCFResultTransportAction.java
@@ -20,24 +20,24 @@
 import org.opensearch.Version;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
 import org.opensearch.ad.cluster.HashRing;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.constant.CommonErrorMessages;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.stats.ADStats;
-import org.opensearch.ad.stats.StatNames;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.common.inject.Inject;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.stats.StatNames;
 import org.opensearch.transport.TransportService;

 public class RCFResultTransportAction extends HandledTransportAction {
     private static final Logger LOG = LogManager.getLogger(RCFResultTransportAction.class);
     private ModelManager manager;
-    private ADCircuitBreakerService adCircuitBreakerService;
+    private CircuitBreakerService adCircuitBreakerService;
     private HashRing hashRing;
     private ADStats adStats;
@@ -46,7 +46,7 @@ public RCFResultTransportAction(
         ActionFilters actionFilters,
         TransportService transportService,
         ModelManager manager,
-        ADCircuitBreakerService adCircuitBreakerService,
+        CircuitBreakerService adCircuitBreakerService,
         HashRing hashRing,
         ADStats adStats
     ) {
@@ -60,7 +60,7 @@ public RCFResultTransportAction(
     @Override
     protected void doExecute(Task task, RCFResultRequest request, ActionListener listener) {
         if (adCircuitBreakerService.isOpen()) {
-            listener.onFailure(new LimitExceededException(request.getAdID(), CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG));
+            listener.onFailure(new LimitExceededException(request.getAdID(), CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG));
             return;
         }
         Optional remoteNode = hashRing.getNodeByAddress(request.remoteAddress());
diff --git a/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoResponse.java b/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoResponse.java
index 6f8333173..852c39d1a 100644
--- a/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoResponse.java
+++ b/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoResponse.java
@@ -13,12 +13,12 @@
 import java.io.IOException;

-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.ToXContentObject;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.util.RestHandlerUtils;

 public class SearchAnomalyDetectorInfoResponse extends ActionResponse implements ToXContentObject {
     private long count;
diff --git a/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoTransportAction.java b/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoTransportAction.java
index 3fb7d083b..b932ae601 100644
--- a/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoTransportAction.java
@@ -11,9 +11,8 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_GET_DETECTOR_INFO;
-import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX;
-import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener;
+import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_GET_CONFIG_INFO;
+import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener;

 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
@@ -21,7 +20,6 @@
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
-import org.opensearch.ad.util.RestHandlerUtils;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.inject.Inject;
@@ -32,6 +30,8 @@
 import org.opensearch.index.query.TermsQueryBuilder;
 import org.opensearch.search.builder.SearchSourceBuilder;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.util.RestHandlerUtils;
 import org.opensearch.transport.TransportService;

 public class SearchAnomalyDetectorInfoTransportAction extends
@@ -60,9 +60,9 @@ protected void doExecute(
     ) {
         String name = request.getName();
         String rawPath = request.getRawPath();
-        ActionListener listener = wrapRestActionListener(actionListener, FAIL_TO_GET_DETECTOR_INFO);
+        ActionListener listener = wrapRestActionListener(actionListener, FAIL_TO_GET_CONFIG_INFO);
         try (ThreadContext.StoredContext context = client.threadPool().getThreadContext().stashContext()) {
-            SearchRequest searchRequest = new SearchRequest().indices(ANOMALY_DETECTORS_INDEX);
+            SearchRequest searchRequest = new SearchRequest().indices(CommonName.CONFIG_INDEX);
             if (rawPath.endsWith(RestHandlerUtils.COUNT)) {
                 // Count detectors
                 SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
diff --git a/src/main/java/org/opensearch/ad/transport/SearchAnomalyResultTransportAction.java b/src/main/java/org/opensearch/ad/transport/SearchAnomalyResultTransportAction.java
index f1a115741..2ae1f93d5 100644
--- a/src/main/java/org/opensearch/ad/transport/SearchAnomalyResultTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/SearchAnomalyResultTransportAction.java
@@ -11,8 +11,8 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.constant.CommonName.CUSTOM_RESULT_INDEX_PREFIX;
-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.ad.constant.ADCommonName.CUSTOM_RESULT_INDEX_PREFIX;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_DETECTOR_UPPER_LIMIT;

 import java.util.ArrayList;
@@ -44,6 +44,7 @@
 import org.opensearch.search.aggregations.bucket.terms.TermsAggregationBuilder;
 import org.opensearch.search.builder.SearchSourceBuilder;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.constant.CommonName;
 import org.opensearch.transport.TransportService;

 import com.google.common.annotations.VisibleForTesting;
@@ -122,7 +123,7 @@ SearchRequest createSingleSearchRequest() {
             .field(AnomalyDetector.RESULT_INDEX_FIELD)
             .size(MAX_DETECTOR_UPPER_LIMIT);
         searchResultIndexBuilder.aggregation(aggregation).size(0);
-        return new SearchRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX).source(searchResultIndexBuilder);
+        return new SearchRequest(CommonName.CONFIG_INDEX).source(searchResultIndexBuilder);
     }

     @VisibleForTesting
diff --git a/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequest.java b/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequest.java
index dce0f4071..8ae80077e 100644
--- a/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequest.java
+++ b/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequest.java
@@ -20,10 +20,10 @@
 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.util.ParseUtils;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.common.io.stream.StreamOutput;
 import org.opensearch.core.xcontent.XContentParser;
+import org.opensearch.timeseries.util.ParseUtils;

 /**
  * Request for getting the top anomaly results for HC detectors.
@@ -80,7 +80,7 @@ public SearchTopAnomalyResultRequest(
         this.endTime = endTime;
     }

-    public String getDetectorId() {
+    public String getId() {
         return detectorId;
     }
diff --git a/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportAction.java b/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportAction.java
index b49ec8760..82a1a02a3 100644
--- a/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportAction.java
+++ b/src/main/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportAction.java
@@ -11,7 +11,7 @@ package org.opensearch.ad.transport;

-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
 import static org.opensearch.ad.settings.AnomalyDetectorSettings.TOP_ANOMALY_RESULT_TIMEOUT_IN_MILLIS;

 import java.time.Clock;
@@ -31,8 +31,6 @@
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.HandledTransportAction;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.ad.model.AnomalyResultBucket;
@@ -63,6 +61,9 @@
 import org.opensearch.search.sort.FieldSortBuilder;
 import org.opensearch.search.sort.SortOrder;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonName;
 import org.opensearch.transport.TransportService;

 import com.google.common.collect.ImmutableMap;

@@ -219,7 +220,7 @@ public SearchTopAnomalyResultTransportAction(
     protected void doExecute(Task task, SearchTopAnomalyResultRequest request, ActionListener listener) {
         GetAnomalyDetectorRequest getAdRequest = new GetAnomalyDetectorRequest(
-            request.getDetectorId(),
+            request.getId(),
             // The default version value used in org.opensearch.rest.action.RestActions.parseVersion()
             -3L,
             false,
@@ -232,16 +233,14 @@ protected void doExecute(Task task, SearchTopAnomalyResultRequest request, Actio
         client.execute(GetAnomalyDetectorAction.INSTANCE, getAdRequest, ActionListener.wrap(getAdResponse -> {
             // Make sure detector exists
             if (getAdResponse.getDetector() == null) {
-                throw new IllegalArgumentException(
-                    String.format(Locale.ROOT, "No anomaly detector found with ID %s", request.getDetectorId())
-                );
+                throw new IllegalArgumentException(String.format(Locale.ROOT, "No anomaly detector found with ID %s", request.getId()));
             }

             // Make sure detector is HC
-            List categoryFieldsFromResponse = getAdResponse.getDetector().getCategoryField();
+            List categoryFieldsFromResponse = getAdResponse.getDetector().getCategoryFields();
             if (categoryFieldsFromResponse == null || categoryFieldsFromResponse.isEmpty()) {
                 throw new IllegalArgumentException(
request.getDetectorId()) + String.format(Locale.ROOT, "No category fields found for detector ID %s", request.getId()) ); } @@ -253,13 +252,7 @@ protected void doExecute(Task task, SearchTopAnomalyResultRequest request, Actio for (String categoryField : request.getCategoryFields()) { if (!categoryFieldsFromResponse.contains(categoryField)) { throw new IllegalArgumentException( - String - .format( - Locale.ROOT, - "Category field %s doesn't exist for detector ID %s", - categoryField, - request.getDetectorId() - ) + String.format(Locale.ROOT, "Category field %s doesn't exist for detector ID %s", categoryField, request.getId()) ); } } @@ -271,7 +264,7 @@ protected void doExecute(Task task, SearchTopAnomalyResultRequest request, Actio ADTask historicalTask = getAdResponse.getHistoricalAdTask(); if (historicalTask == null) { throw new ResourceNotFoundException( - String.format(Locale.ROOT, "No historical tasks found for detector ID %s", request.getDetectorId()) + String.format(Locale.ROOT, "No historical tasks found for detector ID %s", request.getId()) ); } if (Strings.isNullOrEmpty(request.getTaskId())) { @@ -309,7 +302,7 @@ protected void doExecute(Task task, SearchTopAnomalyResultRequest request, Actio SearchRequest searchRequest = generateSearchRequest(request); // Adding search over any custom result indices - String rawCustomResultIndex = getAdResponse.getDetector().getResultIndex(); + String rawCustomResultIndex = getAdResponse.getDetector().getCustomResultIndex(); String customResultIndex = rawCustomResultIndex == null ? null : rawCustomResultIndex.trim(); if (!Strings.isNullOrEmpty(customResultIndex)) { searchRequest.indices(defaultIndex, customResultIndex); @@ -423,7 +416,7 @@ public void onResponse(SearchResponse response) { listener.onResponse(new SearchTopAnomalyResultResponse(getDescendingOrderListFromHeap(topResultsHeap))); } else if (expirationEpochMs < clock.millis()) { if (topResultsHeap.isEmpty()) { - listener.onFailure(new AnomalyDetectionException("Timed out getting all top anomaly results. Please retry later.")); + listener.onFailure(new TimeSeriesException("Timed out getting all top anomaly results. Please retry later.")); } else { logger.info("Timed out getting all top anomaly results. 
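The HC guardrails above reduce to two checks: the detector must define category fields, and every requested field must be among them. A minimal standalone sketch (not part of this patch); the class and method names are illustrative.

import java.util.List;
import java.util.Locale;

final class CategoryFieldCheckSketch {
    // Throws if the detector is not HC or a requested category field is undefined.
    static void validateCategoryFields(String configId, List<String> requested, List<String> defined) {
        if (defined == null || defined.isEmpty()) {
            throw new IllegalArgumentException(
                String.format(Locale.ROOT, "No category fields found for detector ID %s", configId));
        }
        for (String field : requested) {
            if (!defined.contains(field)) {
                throw new IllegalArgumentException(
                    String.format(Locale.ROOT, "Category field %s doesn't exist for detector ID %s", field, configId));
            }
        }
    }
}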
Sending back partial results."); listener.onResponse(new SearchTopAnomalyResultResponse(getDescendingOrderListFromHeap(topResultsHeap))); @@ -486,18 +479,18 @@ private QueryBuilder generateQuery(SearchTopAnomalyResultRequest request) { // Adding the date range and anomaly grade filters (needed regardless of real-time or historical) RangeQueryBuilder dateRangeFilter = QueryBuilders - .rangeQuery(AnomalyResult.DATA_END_TIME_FIELD) + .rangeQuery(CommonName.DATA_END_TIME_FIELD) .gte(request.getStartTime().toEpochMilli()) .lte(request.getEndTime().toEpochMilli()); RangeQueryBuilder anomalyGradeFilter = QueryBuilders.rangeQuery(AnomalyResult.ANOMALY_GRADE_FIELD).gt(0); query.filter(dateRangeFilter).filter(anomalyGradeFilter); if (request.getHistorical() == true) { - TermQueryBuilder taskIdFilter = QueryBuilders.termQuery(AnomalyResult.TASK_ID_FIELD, request.getTaskId()); + TermQueryBuilder taskIdFilter = QueryBuilders.termQuery(CommonName.TASK_ID_FIELD, request.getTaskId()); query.filter(taskIdFilter); } else { - TermQueryBuilder detectorIdFilter = QueryBuilders.termQuery(AnomalyResult.DETECTOR_ID_FIELD, request.getDetectorId()); - ExistsQueryBuilder taskIdExistsFilter = QueryBuilders.existsQuery(AnomalyResult.TASK_ID_FIELD); + TermQueryBuilder detectorIdFilter = QueryBuilders.termQuery(AnomalyResult.DETECTOR_ID_FIELD, request.getId()); + ExistsQueryBuilder taskIdExistsFilter = QueryBuilders.existsQuery(CommonName.TASK_ID_FIELD); query.filter(detectorIdFilter).mustNot(taskIdExistsFilter); } return query; diff --git a/src/main/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportAction.java index 69b61583e..ebf4016cf 100644 --- a/src/main/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportAction.java @@ -11,8 +11,8 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_GET_STATS; -import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_GET_STATS; +import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener; import java.util.HashMap; import java.util.List; @@ -29,8 +29,6 @@ import org.opensearch.ad.model.AnomalyDetectorType; import org.opensearch.ad.stats.ADStats; import org.opensearch.ad.stats.ADStatsResponse; -import org.opensearch.ad.stats.StatNames; -import org.opensearch.ad.util.MultiResponsesDelegateActionListener; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.inject.Inject; @@ -42,6 +40,9 @@ import org.opensearch.search.aggregations.bucket.terms.TermsAggregationBuilder; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.MultiResponsesDelegateActionListener; import org.opensearch.transport.TransportService; public class StatsAnomalyDetectorTransportAction extends HandledTransportAction { @@ -129,11 +130,11 @@ private void getClusterStats( if ((adStatsRequest.getStatsToBeRetrieved().contains(StatNames.DETECTOR_COUNT.getName()) || adStatsRequest.getStatsToBeRetrieved().contains(StatNames.SINGLE_ENTITY_DETECTOR_COUNT.getName()) || 
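The generateQuery changes above keep one filter shape for both modes: a date-range and anomaly-grade filter always apply, then historical queries scope to a task while real-time queries scope to the detector and exclude task-backed documents. A minimal sketch (not part of this patch), with the field-name literals as assumptions standing in for the AnomalyResult/CommonName constants:

import org.opensearch.index.query.BoolQueryBuilder;
import org.opensearch.index.query.QueryBuilders;

final class ResultFilterSketch {
    static BoolQueryBuilder resultFilter(long startMillis, long endMillis,
                                         boolean historical, String taskId, String detectorId) {
        BoolQueryBuilder query = QueryBuilders.boolQuery()
            .filter(QueryBuilders.rangeQuery("data_end_time").gte(startMillis).lte(endMillis))
            .filter(QueryBuilders.rangeQuery("anomaly_grade").gt(0)); // anomalies only
        if (historical) {
            // historical results are scoped to a single task
            query.filter(QueryBuilders.termQuery("task_id", taskId));
        } else {
            // real-time results: match the detector and exclude task-backed docs
            query.filter(QueryBuilders.termQuery("detector_id", detectorId))
                 .mustNot(QueryBuilders.existsQuery("task_id"));
        }
        return query;
    }
}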
adStatsRequest.getStatsToBeRetrieved().contains(StatNames.MULTI_ENTITY_DETECTOR_COUNT.getName())) - && clusterService.state().getRoutingTable().hasIndex(AnomalyDetector.ANOMALY_DETECTORS_INDEX)) { + && clusterService.state().getRoutingTable().hasIndex(CommonName.CONFIG_INDEX)) { TermsAggregationBuilder termsAgg = AggregationBuilders.terms(DETECTOR_TYPE_AGG).field(AnomalyDetector.DETECTOR_TYPE_FIELD); SearchRequest request = new SearchRequest() - .indices(AnomalyDetector.ANOMALY_DETECTORS_INDEX) + .indices(CommonName.CONFIG_INDEX) .source(new SearchSourceBuilder().aggregation(termsAgg).size(0).trackTotalHits(true)); client.search(request, ActionListener.wrap(r -> { diff --git a/src/main/java/org/opensearch/ad/transport/StopDetectorRequest.java b/src/main/java/org/opensearch/ad/transport/StopDetectorRequest.java index 2b3fb419e..71563a2cd 100644 --- a/src/main/java/org/opensearch/ad/transport/StopDetectorRequest.java +++ b/src/main/java/org/opensearch/ad/transport/StopDetectorRequest.java @@ -19,8 +19,8 @@ import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.InputStreamStreamInput; import org.opensearch.core.common.io.stream.OutputStreamStreamOutput; @@ -64,7 +64,7 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(adID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } return validationException; } @@ -72,7 +72,7 @@ public ActionRequestValidationException validate() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, adID); + builder.field(ADCommonName.ID_JSON_KEY, adID); builder.endObject(); return builder; } diff --git a/src/main/java/org/opensearch/ad/transport/StopDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/StopDetectorTransportAction.java index 6dd303d0e..deafd8854 100644 --- a/src/main/java/org/opensearch/ad/transport/StopDetectorTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/StopDetectorTransportAction.java @@ -11,7 +11,7 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_STOP_DETECTOR; +import static org.opensearch.ad.constant.ADCommonMessages.FAIL_TO_STOP_DETECTOR; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; @@ -21,13 +21,13 @@ import org.opensearch.action.FailedNodeException; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; -import org.opensearch.ad.common.exception.InternalFailure; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.inject.Inject; import org.opensearch.core.action.ActionListener; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.common.exception.InternalFailure; 
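The StopDetectorRequest change above keeps the standard null-accumulating validate() idiom while swapping in the renamed message constants. A minimal sketch (not part of this patch); the message literal stands in for ADCommonMessages.AD_ID_MISSING_MSG.

import static org.opensearch.action.ValidateActions.addValidationError;

import org.opensearch.action.ActionRequestValidationException;
import org.opensearch.core.common.Strings;

final class ValidateSketch {
    // Returns null when the request is valid; callers chain further checks onto `e`.
    static ActionRequestValidationException validate(String adID) {
        ActionRequestValidationException e = null;
        if (Strings.isEmpty(adID)) {
            e = addValidationError("AD ID is missing", e); // stand-in for AD_ID_MISSING_MSG
        }
        return e;
    }
}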
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import org.opensearch.transport.TransportService; public class StopDetectorTransportAction extends HandledTransportAction { diff --git a/src/main/java/org/opensearch/ad/transport/ThresholdResultRequest.java b/src/main/java/org/opensearch/ad/transport/ThresholdResultRequest.java index 46ab51c27..d9310ae31 100644 --- a/src/main/java/org/opensearch/ad/transport/ThresholdResultRequest.java +++ b/src/main/java/org/opensearch/ad/transport/ThresholdResultRequest.java @@ -17,13 +17,14 @@ import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.common.Strings; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.constant.CommonName; public class ThresholdResultRequest extends ActionRequest implements ToXContentObject { private String adID; @@ -72,10 +73,10 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (Strings.isEmpty(adID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } if (Strings.isEmpty(modelID)) { - validationException = addValidationError(CommonErrorMessages.MODEL_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.MODEL_ID_MISSING_MSG, validationException); } return validationException; @@ -84,9 +85,9 @@ public ActionRequestValidationException validate() { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, adID); - builder.field(CommonName.MODEL_ID_KEY, modelID); - builder.field(CommonName.RCF_SCORE_JSON_KEY, rcfScore); + builder.field(ADCommonName.ID_JSON_KEY, adID); + builder.field(CommonName.MODEL_ID_FIELD, modelID); + builder.field(ADCommonName.RCF_SCORE_JSON_KEY, rcfScore); builder.endObject(); return builder; } diff --git a/src/main/java/org/opensearch/ad/transport/ThresholdResultResponse.java b/src/main/java/org/opensearch/ad/transport/ThresholdResultResponse.java index b3b66fd57..0747395f9 100644 --- a/src/main/java/org/opensearch/ad/transport/ThresholdResultResponse.java +++ b/src/main/java/org/opensearch/ad/transport/ThresholdResultResponse.java @@ -13,7 +13,7 @@ import java.io.IOException; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.action.ActionResponse; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; @@ -52,8 +52,8 @@ public void writeTo(StreamOutput out) throws IOException { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); - builder.field(CommonName.ANOMALY_GRADE_JSON_KEY, anomalyGrade); - builder.field(CommonName.CONFIDENCE_JSON_KEY, confidence); + 
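The ThresholdResultRequest.toXContent change above deliberately mixes plugin-scoped (ADCommonName) and shared (CommonName) key constants. A minimal sketch (not part of this patch); the literal key strings are assumptions standing in for those constants' values.

import java.io.IOException;

import org.opensearch.core.xcontent.XContentBuilder;

final class ThresholdXContentSketch {
    static XContentBuilder write(XContentBuilder builder, String adID, String modelID, double rcfScore)
        throws IOException {
        builder.startObject();
        builder.field("anomaly_detector_id", adID); // stand-in for ADCommonName.ID_JSON_KEY
        builder.field("model_id", modelID);         // stand-in for CommonName.MODEL_ID_FIELD
        builder.field("rcf_score", rcfScore);       // stand-in for ADCommonName.RCF_SCORE_JSON_KEY
        builder.endObject();
        return builder;
    }
}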
builder.field(ADCommonName.ANOMALY_GRADE_JSON_KEY, anomalyGrade); + builder.field(ADCommonName.CONFIDENCE_JSON_KEY, confidence); builder.endObject(); return builder; } diff --git a/src/main/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportAction.java b/src/main/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportAction.java index 1afb8cd13..16eec43ac 100644 --- a/src/main/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportAction.java +++ b/src/main/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportAction.java @@ -11,9 +11,9 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES; -import static org.opensearch.ad.util.ParseUtils.checkFilterByBackendRoles; -import static org.opensearch.ad.util.ParseUtils.getUserContext; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES; +import static org.opensearch.timeseries.util.ParseUtils.checkFilterByBackendRoles; +import static org.opensearch.timeseries.util.ParseUtils.getUserContext; import java.time.Clock; import java.util.HashMap; @@ -26,19 +26,12 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; -import org.opensearch.ad.common.exception.ADValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.feature.SearchFeatureDao; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.DetectorValidationIssue; -import org.opensearch.ad.model.DetectorValidationIssueType; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.model.ValidationAspect; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; import org.opensearch.ad.rest.handler.ValidateAnomalyDetectorActionHandler; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.inject.Inject; @@ -52,6 +45,13 @@ import org.opensearch.rest.RestRequest; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; public class ValidateAnomalyDetectorTransportAction extends @@ -62,7 +62,7 @@ public class ValidateAnomalyDetectorTransportAction extends private final SecurityClientUtil clientUtil; private final ClusterService clusterService; private final NamedXContentRegistry xContentRegistry; - private final AnomalyDetectionIndices anomalyDetectionIndices; + private final ADIndexManagement anomalyDetectionIndices; private final SearchFeatureDao searchFeatureDao; private volatile Boolean filterByEnabled; private Clock clock; @@ -75,7 +75,7 @@ public 
ValidateAnomalyDetectorTransportAction( ClusterService clusterService, NamedXContentRegistry xContentRegistry, Settings settings, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, ActionFilters actionFilters, TransportService transportService, SearchFeatureDao searchFeatureDao @@ -86,8 +86,8 @@ public ValidateAnomalyDetectorTransportAction( this.clusterService = clusterService; this.xContentRegistry = xContentRegistry; this.anomalyDetectionIndices = anomalyDetectionIndices; - this.filterByEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); + this.filterByEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); this.searchFeatureDao = searchFeatureDao; this.clock = Clock.systemUTC(); this.settings = settings; @@ -108,7 +108,7 @@ protected void doExecute(Task task, ValidateAnomalyDetectorRequest request, Acti private void resolveUserAndExecute( User requestedUser, ActionListener listener, - AnomalyDetectorFunction function + ExecutorFunction function ) { try { // Check if user has backend roles @@ -136,9 +136,9 @@ private void validateExecute( // forcing response to be empty listener.onResponse(new ValidateAnomalyDetectorResponse((DetectorValidationIssue) null)); }, exception -> { - if (exception instanceof ADValidationException) { + if (exception instanceof ValidationException) { // ADValidationException is converted as validation issues returned as response to user - DetectorValidationIssue issue = parseADValidationException((ADValidationException) exception); + DetectorValidationIssue issue = parseADValidationException((ValidationException) exception); listener.onResponse(new ValidateAnomalyDetectorResponse(issue)); return; } @@ -176,7 +176,7 @@ private void validateExecute( }, listener); } - protected DetectorValidationIssue parseADValidationException(ADValidationException exception) { + protected DetectorValidationIssue parseADValidationException(ValidationException exception) { String originalErrorMessage = exception.getMessage(); String errorMessage = ""; Map subIssues = null; @@ -231,7 +231,7 @@ private Map getFeatureSubIssuesFromErrorMessage(String errorMess private void checkIndicesAndExecute( List indices, - AnomalyDetectorFunction function, + ExecutorFunction function, ActionListener listener ) { SearchRequest searchRequest = new SearchRequest() @@ -243,11 +243,7 @@ private void checkIndicesAndExecute( // parsed to a DetectorValidationIssue that is returned to // the user as a response indicating index doesn't exist DetectorValidationIssue issue = parseADValidationException( - new ADValidationException( - CommonErrorMessages.INDEX_NOT_FOUND, - DetectorValidationIssueType.INDICES, - ValidationAspect.DETECTOR - ) + new ValidationException(ADCommonMessages.INDEX_NOT_FOUND, ValidationIssueType.INDICES, ValidationAspect.DETECTOR) ); listener.onResponse(new ValidateAnomalyDetectorResponse(issue)); return; diff --git a/src/main/java/org/opensearch/ad/transport/handler/ADSearchHandler.java b/src/main/java/org/opensearch/ad/transport/handler/ADSearchHandler.java index 0f1082e7f..05c69196d 100644 --- a/src/main/java/org/opensearch/ad/transport/handler/ADSearchHandler.java +++ b/src/main/java/org/opensearch/ad/transport/handler/ADSearchHandler.java @@ -11,12 +11,12 @@ 
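The AD_FILTER_BY_BACKEND_ROLES wiring above reads the setting once at construction, then registers a consumer so the volatile field tracks dynamic updates. A minimal sketch (not part of this patch); the setting key is an assumption, and the Setting must also be registered by the plugin for the consumer to fire.

import org.opensearch.cluster.service.ClusterService;
import org.opensearch.common.settings.Setting;
import org.opensearch.common.settings.Settings;

final class RoleFilterBindingSketch {
    static final Setting<Boolean> FILTER_BY_BACKEND_ROLES = Setting
        .boolSetting("plugins.anomaly_detection.filter_by_backend_roles", false,
            Setting.Property.NodeScope, Setting.Property.Dynamic);

    private volatile boolean filterByEnabled;

    RoleFilterBindingSketch(Settings settings, ClusterService clusterService) {
        filterByEnabled = FILTER_BY_BACKEND_ROLES.get(settings);           // initial value
        clusterService.getClusterSettings()
            .addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it);
    }
}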
package org.opensearch.ad.transport.handler; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_SEARCH; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES; -import static org.opensearch.ad.util.ParseUtils.addUserBackendRolesFilter; -import static org.opensearch.ad.util.ParseUtils.getUserContext; -import static org.opensearch.ad.util.ParseUtils.isAdmin; -import static org.opensearch.ad.util.RestHandlerUtils.wrapRestActionListener; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_SEARCH; +import static org.opensearch.timeseries.util.ParseUtils.addUserBackendRolesFilter; +import static org.opensearch.timeseries.util.ParseUtils.getUserContext; +import static org.opensearch.timeseries.util.ParseUtils.isAdmin; +import static org.opensearch.timeseries.util.RestHandlerUtils.wrapRestActionListener; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; @@ -40,8 +40,8 @@ public class ADSearchHandler { public ADSearchHandler(Settings settings, ClusterService clusterService, Client client) { this.client = client; - filterEnabled = AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterEnabled = it); + filterEnabled = AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterEnabled = it); } /** diff --git a/src/main/java/org/opensearch/ad/transport/handler/AnomalyIndexHandler.java b/src/main/java/org/opensearch/ad/transport/handler/AnomalyIndexHandler.java index 6948be72a..9d539f797 100644 --- a/src/main/java/org/opensearch/ad/transport/handler/AnomalyIndexHandler.java +++ b/src/main/java/org/opensearch/ad/transport/handler/AnomalyIndexHandler.java @@ -11,8 +11,8 @@ package org.opensearch.ad.transport.handler; -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_RESULT_INDEX; import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder; +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_RESULT_INDEX; import java.util.Iterator; import java.util.Locale; @@ -25,14 +25,10 @@ import org.opensearch.action.bulk.BackoffPolicy; import org.opensearch.action.index.IndexRequest; import org.opensearch.action.index.IndexResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.util.BulkUtil; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.ad.util.IndexUtils; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.block.ClusterBlockLevel; import org.opensearch.cluster.service.ClusterService; @@ -43,6 +39,10 @@ import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.util.ClientUtil; +import 
org.opensearch.timeseries.util.RestHandlerUtils; public class AnomalyIndexHandler { private static final Logger LOG = LogManager.getLogger(AnomalyIndexHandler.class); @@ -56,7 +56,7 @@ public class AnomalyIndexHandler { protected final ThreadPool threadPool; protected final BackoffPolicy savingBackoffPolicy; protected final String indexName; - protected final AnomalyDetectionIndices anomalyDetectionIndices; + protected final ADIndexManagement anomalyDetectionIndices; // whether save to a specific doc id or not. False by default. protected boolean fixedDoc; protected final ClientUtil clientUtil; @@ -80,7 +80,7 @@ public AnomalyIndexHandler( Settings settings, ThreadPool threadPool, String indexName, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, ClientUtil clientUtil, IndexUtils indexUtils, ClusterService clusterService @@ -89,8 +89,8 @@ public AnomalyIndexHandler( this.threadPool = threadPool; this.savingBackoffPolicy = BackoffPolicy .exponentialBackoff( - AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(settings), - AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(settings) + AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY.get(settings), + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF.get(settings) ); this.indexName = indexName; this.anomalyDetectionIndices = anomalyDetectionIndices; @@ -131,15 +131,15 @@ public void index(T toSave, String detectorId, String customIndexName) { save(toSave, detectorId, customIndexName); return; } - if (!anomalyDetectionIndices.doesDefaultAnomalyResultIndexExist()) { + if (!anomalyDetectionIndices.doesDefaultResultIndexExist()) { anomalyDetectionIndices - .initDefaultAnomalyResultIndexDirectly( + .initDefaultResultIndexDirectly( ActionListener.wrap(initResponse -> onCreateIndexResponse(initResponse, toSave, detectorId), exception -> { if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) { // It is possible the index has been created while we sending the create request save(toSave, detectorId); } else { - throw new AnomalyDetectionException( + throw new TimeSeriesException( detectorId, String.format(Locale.ROOT, "Unexpected error creating index %s", indexName), exception @@ -151,7 +151,7 @@ public void index(T toSave, String detectorId, String customIndexName) { save(toSave, detectorId); } } catch (Exception e) { - throw new AnomalyDetectionException( + throw new TimeSeriesException( detectorId, String.format(Locale.ROOT, "Error in saving %s for detector %s", indexName, detectorId), e @@ -163,7 +163,7 @@ private void onCreateIndexResponse(CreateIndexResponse response, T toSave, Strin if (response.isAcknowledged()) { save(toSave, detectorId); } else { - throw new AnomalyDetectionException( + throw new TimeSeriesException( detectorId, String.format(Locale.ROOT, "Creating %s with mappings call not acknowledged.", indexName) ); @@ -188,7 +188,7 @@ protected void save(T toSave, String detectorId, String indexName) { saveIteration(indexRequest, detectorId, savingBackoffPolicy.iterator()); } catch (Exception e) { LOG.error(String.format(Locale.ROOT, "Failed to save %s", indexName), e); - throw new AnomalyDetectionException(detectorId, String.format(Locale.ROOT, "Cannot save %s", indexName)); + throw new TimeSeriesException(detectorId, String.format(Locale.ROOT, "Cannot save %s", indexName)); } } diff --git a/src/main/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandler.java b/src/main/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandler.java 
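The savingBackoffPolicy above is a plain exponential schedule whose iterator hands out one delay per retry attempt. A minimal sketch (not part of this patch); the initial delay and retry count are assumptions in place of the AD_BACKOFF_INITIAL_DELAY and AD_MAX_RETRY_FOR_BACKOFF settings.

import java.util.Iterator;

import org.opensearch.action.bulk.BackoffPolicy;
import org.opensearch.common.unit.TimeValue;

final class SaveRetrySketch {
    static void retrySchedule() {
        BackoffPolicy policy = BackoffPolicy.exponentialBackoff(TimeValue.timeValueMillis(1000), 3);
        Iterator<TimeValue> backoff = policy.iterator();
        while (backoff.hasNext()) {
            TimeValue delay = backoff.next();
            // schedule the next save attempt after `delay`; delays grow roughly
            // exponentially, and once hasNext() is false the failure is surfaced
        }
    }
}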
index 3d78975aa..d61fd1794 100644 --- a/src/main/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandler.java +++ b/src/main/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandler.java @@ -11,9 +11,9 @@ package org.opensearch.ad.transport.handler; -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_RESULT_INDEX; -import static org.opensearch.ad.constant.CommonName.ANOMALY_RESULT_INDEX_ALIAS; +import static org.opensearch.ad.constant.ADCommonName.ANOMALY_RESULT_INDEX_ALIAS; import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder; +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_RESULT_INDEX; import java.util.List; @@ -24,24 +24,24 @@ import org.opensearch.action.bulk.BulkRequestBuilder; import org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.index.IndexRequest; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.ad.util.IndexUtils; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.RestHandlerUtils; public class AnomalyResultBulkIndexHandler extends AnomalyIndexHandler { private static final Logger LOG = LogManager.getLogger(AnomalyResultBulkIndexHandler.class); - private AnomalyDetectionIndices anomalyDetectionIndices; + private ADIndexManagement anomalyDetectionIndices; public AnomalyResultBulkIndexHandler( Client client, @@ -50,7 +50,7 @@ public AnomalyResultBulkIndexHandler( ClientUtil clientUtil, IndexUtils indexUtils, ClusterService clusterService, - AnomalyDetectionIndices anomalyDetectionIndices + ADIndexManagement anomalyDetectionIndices ) { super(client, settings, threadPool, ANOMALY_RESULT_INDEX_ALIAS, anomalyDetectionIndices, clientUtil, indexUtils, clusterService); this.anomalyDetectionIndices = anomalyDetectionIndices; @@ -68,7 +68,7 @@ public void bulkIndexAnomalyResult(String resultIndex, List anoma listener.onResponse(null); return; } - String detectorId = anomalyResults.get(0).getDetectorId(); + String detectorId = anomalyResults.get(0).getConfigId(); try { if (resultIndex != null) { // Only create custom AD result index when create detector, won’t recreate custom AD result index in realtime @@ -83,14 +83,14 @@ public void bulkIndexAnomalyResult(String resultIndex, List anoma bulkSaveDetectorResult(resultIndex, anomalyResults, listener); return; } - if (!anomalyDetectionIndices.doesDefaultAnomalyResultIndexExist()) { - anomalyDetectionIndices.initDefaultAnomalyResultIndexDirectly(ActionListener.wrap(response -> { + if (!anomalyDetectionIndices.doesDefaultResultIndexExist()) { + anomalyDetectionIndices.initDefaultResultIndexDirectly(ActionListener.wrap(response -> { if (response.isAcknowledged()) { bulkSaveDetectorResult(anomalyResults, listener); } else 
{ String error = "Creating anomaly result index with mappings call not acknowledged"; LOG.error(error); - listener.onFailure(new AnomalyDetectionException(error)); + listener.onFailure(new TimeSeriesException(error)); } }, exception -> { if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) { @@ -103,12 +103,12 @@ public void bulkIndexAnomalyResult(String resultIndex, List anoma } else { bulkSaveDetectorResult(anomalyResults, listener); } - } catch (AnomalyDetectionException e) { + } catch (TimeSeriesException e) { listener.onFailure(e); } catch (Exception e) { String error = "Failed to bulk index anomaly result"; LOG.error(error, e); - listener.onFailure(new AnomalyDetectionException(error, e)); + listener.onFailure(new TimeSeriesException(error, e)); } } @@ -126,14 +126,14 @@ private void bulkSaveDetectorResult(String resultIndex, List anom } catch (Exception e) { String error = "Failed to prepare request to bulk index anomaly results"; LOG.error(error, e); - throw new AnomalyDetectionException(error); + throw new TimeSeriesException(error); } }); client.bulk(bulkRequestBuilder.request(), ActionListener.wrap(r -> { if (r.hasFailures()) { String failureMessage = r.buildFailureMessage(); LOG.warn("Failed to bulk index AD result " + failureMessage); - listener.onFailure(new AnomalyDetectionException(failureMessage)); + listener.onFailure(new TimeSeriesException(failureMessage)); } else { listener.onResponse(r); } diff --git a/src/main/java/org/opensearch/ad/transport/handler/MultiEntityResultHandler.java b/src/main/java/org/opensearch/ad/transport/handler/MultiEntityResultHandler.java index 3ea95f1cf..13f7e16e7 100644 --- a/src/main/java/org/opensearch/ad/transport/handler/MultiEntityResultHandler.java +++ b/src/main/java/org/opensearch/ad/transport/handler/MultiEntityResultHandler.java @@ -15,14 +15,12 @@ import org.apache.logging.log4j.Logger; import org.opensearch.ExceptionsHelper; import org.opensearch.ResourceAlreadyExistsException; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.transport.ADResultBulkAction; import org.opensearch.ad.transport.ADResultBulkRequest; import org.opensearch.ad.transport.ADResultBulkResponse; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.ad.util.IndexUtils; import org.opensearch.client.Client; import org.opensearch.cluster.block.ClusterBlockLevel; @@ -31,6 +29,8 @@ import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.util.ClientUtil; /** * EntityResultTransportAction depends on this class. 
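bulkSaveDetectorResult above treats a partially failed bulk as a hard failure: item-level failures are collapsed into a single TimeSeriesException. A minimal sketch (not part of this patch) of that pattern:

import org.opensearch.action.bulk.BulkRequest;
import org.opensearch.action.bulk.BulkResponse;
import org.opensearch.client.Client;
import org.opensearch.core.action.ActionListener;
import org.opensearch.timeseries.common.exception.TimeSeriesException;

final class BulkSaveSketch {
    static void bulkSave(Client client, BulkRequest bulk, ActionListener<BulkResponse> listener) {
        client.bulk(bulk, ActionListener.wrap(r -> {
            if (r.hasFailures()) {
                // collapse item-level failures into one exception for the caller
                listener.onFailure(new TimeSeriesException(r.buildFailureMessage()));
            } else {
                listener.onResponse(r);
            }
        }, listener::onFailure));
    }
}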
I cannot use @@ -52,7 +52,7 @@ public MultiEntityResultHandler( Client client, Settings settings, ThreadPool threadPool, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, ClientUtil clientUtil, IndexUtils indexUtils, ClusterService clusterService @@ -61,7 +61,7 @@ public MultiEntityResultHandler( client, settings, threadPool, - CommonName.ANOMALY_RESULT_INDEX_ALIAS, + ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, anomalyDetectionIndices, clientUtil, indexUtils, @@ -76,18 +76,18 @@ public MultiEntityResultHandler( */ public void flush(ADResultBulkRequest currentBulkRequest, ActionListener listener) { if (indexUtils.checkIndicesBlocked(clusterService.state(), ClusterBlockLevel.WRITE, this.indexName)) { - listener.onFailure(new AnomalyDetectionException(CANNOT_SAVE_RESULT_ERR_MSG)); + listener.onFailure(new TimeSeriesException(CANNOT_SAVE_RESULT_ERR_MSG)); return; } try { - if (!anomalyDetectionIndices.doesDefaultAnomalyResultIndexExist()) { - anomalyDetectionIndices.initDefaultAnomalyResultIndexDirectly(ActionListener.wrap(initResponse -> { + if (!anomalyDetectionIndices.doesDefaultResultIndexExist()) { + anomalyDetectionIndices.initDefaultResultIndexDirectly(ActionListener.wrap(initResponse -> { if (initResponse.isAcknowledged()) { bulk(currentBulkRequest, listener); } else { LOG.warn("Creating result index with mappings call not acknowledged."); - listener.onFailure(new AnomalyDetectionException("", "Creating result index with mappings call not acknowledged.")); + listener.onFailure(new TimeSeriesException("", "Creating result index with mappings call not acknowledged.")); } }, exception -> { if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) { @@ -109,7 +109,7 @@ public void flush(ADResultBulkRequest currentBulkRequest, ActionListener listener) { if (currentBulkRequest.numberOfActions() <= 0) { - listener.onFailure(new AnomalyDetectionException("no result to save")); + listener.onFailure(new TimeSeriesException("no result to save")); return; } client.execute(ADResultBulkAction.INSTANCE, currentBulkRequest, ActionListener.wrap(response -> { diff --git a/src/main/java/org/opensearch/ad/util/BulkUtil.java b/src/main/java/org/opensearch/ad/util/BulkUtil.java index d7fe9c6f6..b754b1951 100644 --- a/src/main/java/org/opensearch/ad/util/BulkUtil.java +++ b/src/main/java/org/opensearch/ad/util/BulkUtil.java @@ -23,6 +23,7 @@ import org.opensearch.action.bulk.BulkRequest; import org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.index.IndexRequest; +import org.opensearch.timeseries.util.ExceptionUtil; public class BulkUtil { private static final Logger logger = LogManager.getLogger(BulkUtil.class); diff --git a/src/main/java/org/opensearch/ad/util/Bwc.java b/src/main/java/org/opensearch/ad/util/Bwc.java deleted file mode 100644 index 6f921b0e2..000000000 --- a/src/main/java/org/opensearch/ad/util/Bwc.java +++ /dev/null @@ -1,32 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. 
- */ - -package org.opensearch.ad.util; - -import org.opensearch.Version; - -/** - * A helper class for various feature backward compatibility test - * - */ -public class Bwc { - public static boolean DISABLE_BWC = true; - - /** - * We are gonna start supporting multi-category fields since version 1.1.0. - * - * @param version test version - * @return whether the version support multiple category fields - */ - public static boolean supportMultiCategoryFields(Version version) { - return version.after(Version.V_1_0_0); - } -} diff --git a/src/main/java/org/opensearch/ad/util/ClientUtil.java b/src/main/java/org/opensearch/ad/util/ClientUtil.java deleted file mode 100644 index f3f6ef68d..000000000 --- a/src/main/java/org/opensearch/ad/util/ClientUtil.java +++ /dev/null @@ -1,318 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.util; - -import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT; - -import java.util.List; -import java.util.Optional; -import java.util.concurrent.CountDownLatch; -import java.util.concurrent.TimeUnit; -import java.util.concurrent.atomic.AtomicReference; -import java.util.function.BiConsumer; -import java.util.function.Function; - -import org.apache.logging.log4j.Logger; -import org.opensearch.OpenSearchException; -import org.opensearch.OpenSearchTimeoutException; -import org.opensearch.action.ActionRequest; -import org.opensearch.action.ActionType; -import org.opensearch.action.LatchedActionListener; -import org.opensearch.action.TaskOperationFailure; -import org.opensearch.action.admin.cluster.node.tasks.cancel.CancelTasksAction; -import org.opensearch.action.admin.cluster.node.tasks.cancel.CancelTasksRequest; -import org.opensearch.action.admin.cluster.node.tasks.cancel.CancelTasksResponse; -import org.opensearch.action.admin.cluster.node.tasks.list.ListTasksAction; -import org.opensearch.action.admin.cluster.node.tasks.list.ListTasksRequest; -import org.opensearch.action.admin.cluster.node.tasks.list.ListTasksResponse; -import org.opensearch.ad.common.exception.InternalFailure; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.client.Client; -import org.opensearch.common.action.ActionFuture; -import org.opensearch.common.inject.Inject; -import org.opensearch.common.settings.Settings; -import org.opensearch.common.unit.TimeValue; -import org.opensearch.common.util.concurrent.ThreadContext; -import org.opensearch.core.action.ActionListener; -import org.opensearch.core.action.ActionResponse; -import org.opensearch.core.tasks.TaskId; -import org.opensearch.tasks.Task; -import org.opensearch.tasks.TaskInfo; -import org.opensearch.threadpool.ThreadPool; - -public class ClientUtil { - private volatile TimeValue requestTimeout; - private Client client; - private final Throttler throttler; - private ThreadPool threadPool; - - @Inject - public ClientUtil(Settings setting, Client client, Throttler throttler, ThreadPool threadPool) { - this.requestTimeout = REQUEST_TIMEOUT.get(setting); - this.client = client; - this.throttler = throttler; - this.threadPool = threadPool; - } - - /** - * Send a nonblocking request with a timeout and return 
response. Blocking is not allowed in a - * transport call context. See BaseFuture.blockingAllowed - * @param request request like index/search/get - * @param LOG log - * @param consumer functional interface to operate as a client request like client::get - * @param ActionRequest - * @param ActionResponse - * @return the response - * @throws OpenSearchTimeoutException when we cannot get response within time. - * @throws IllegalStateException when the waiting thread is interrupted - */ - public Optional timedRequest( - Request request, - Logger LOG, - BiConsumer> consumer - ) { - try { - AtomicReference respReference = new AtomicReference<>(); - final CountDownLatch latch = new CountDownLatch(1); - - consumer - .accept( - request, - new LatchedActionListener(ActionListener.wrap(response -> { respReference.set(response); }, exception -> { - LOG.error("Cannot get response for request {}, error: {}", request, exception); - }), latch) - ); - - if (!latch.await(requestTimeout.getSeconds(), TimeUnit.SECONDS)) { - throw new OpenSearchTimeoutException("Cannot get response within time limit: " + request.toString()); - } - return Optional.ofNullable(respReference.get()); - } catch (InterruptedException e1) { - LOG.error(CommonErrorMessages.WAIT_ERR_MSG); - throw new IllegalStateException(e1); - } - } - - /** - * Send an asynchronous request and handle response with the provided listener. - * @param ActionRequest - * @param ActionResponse - * @param request request body - * @param consumer request method, functional interface to operate as a client request like client::get - * @param listener needed to handle response - */ - public void asyncRequest( - Request request, - BiConsumer> consumer, - ActionListener listener - ) { - consumer - .accept( - request, - ActionListener.wrap(response -> { listener.onResponse(response); }, exception -> { listener.onFailure(exception); }) - ); - } - - /** - * Execute a transport action and handle response with the provided listener. - * @param ActionRequest - * @param ActionResponse - * @param action transport action - * @param request request body - * @param listener needed to handle response - */ - public void execute( - ActionType action, - Request request, - ActionListener listener - ) { - client.execute(action, request, ActionListener.wrap(response -> { listener.onResponse(response); }, exception -> { - listener.onFailure(exception); - })); - } - - /** - * Send an synchronous request and handle response with the provided listener. - * - * @deprecated use asyncRequest with listener instead. - * - * @param ActionRequest - * @param ActionResponse - * @param request request body - * @param function request method, functional interface to operate as a client request like client::get - * @return the response - */ - @Deprecated - public Response syncRequest( - Request request, - Function> function - ) { - return function.apply(request).actionGet(requestTimeout); - } - - /** - * Send a nonblocking request with a timeout and return response. - * If there is already a query running on given detector, it will try to - * cancel the query. Otherwise it will add this query to the negative cache - * and then attach the AnomalyDetection specific header to the request. - * Once the request complete, it will be removed from the negative cache. 
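The deleted timedRequest turns an async client call into a bounded blocking one with a CountDownLatch. A minimal sketch (not part of this patch) of just the waiting half; the callback side is assumed to set the reference and count down the latch.

import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

import org.opensearch.OpenSearchTimeoutException;

final class TimedWaitSketch {
    static <T> Optional<T> await(AtomicReference<T> result, CountDownLatch latch, long timeoutSeconds)
        throws InterruptedException {
        if (!latch.await(timeoutSeconds, TimeUnit.SECONDS)) {
            throw new OpenSearchTimeoutException("Cannot get response within time limit");
        }
        return Optional.ofNullable(result.get()); // empty if the callback failed
    }
}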
- * @param ActionRequest - * @param ActionResponse - * @param request request like index/search/get - * @param LOG log - * @param consumer functional interface to operate as a client request like client::get - * @param detector Anomaly Detector - * @return the response - * @throws InternalFailure when there is already a query running - * @throws OpenSearchTimeoutException when we cannot get response within time. - * @throws IllegalStateException when the waiting thread is interrupted - */ - public Optional throttledTimedRequest( - Request request, - Logger LOG, - BiConsumer> consumer, - AnomalyDetector detector - ) { - - try { - String detectorId = detector.getDetectorId(); - if (!throttler.insertFilteredQuery(detectorId, request)) { - LOG.info("There is one query running for detectorId: {}. Trying to cancel the long running query", detectorId); - cancelRunningQuery(client, detectorId, LOG); - throw new InternalFailure(detector.getDetectorId(), "There is already a query running on AnomalyDetector"); - } - AtomicReference respReference = new AtomicReference<>(); - final CountDownLatch latch = new CountDownLatch(1); - - try (ThreadContext.StoredContext context = threadPool.getThreadContext().stashContext()) { - assert context != null; - threadPool.getThreadContext().putHeader(Task.X_OPAQUE_ID, CommonName.ANOMALY_DETECTOR + ":" + detectorId); - consumer.accept(request, new LatchedActionListener(ActionListener.wrap(response -> { - // clear negative cache - throttler.clearFilteredQuery(detectorId); - respReference.set(response); - }, exception -> { - // clear negative cache - throttler.clearFilteredQuery(detectorId); - LOG.error("Cannot get response for request {}, error: {}", request, exception); - }), latch)); - } catch (Exception e) { - LOG.error("Failed to process the request for detectorId: {}.", detectorId); - throttler.clearFilteredQuery(detectorId); - throw e; - } - - if (!latch.await(requestTimeout.getSeconds(), TimeUnit.SECONDS)) { - throw new OpenSearchTimeoutException("Cannot get response within time limit: " + request.toString()); - } - return Optional.ofNullable(respReference.get()); - } catch (InterruptedException e1) { - LOG.error(CommonErrorMessages.WAIT_ERR_MSG); - throw new IllegalStateException(e1); - } - } - - /** - * Check if there is running query on given detector - * @param detector Anomaly Detector - * @return true if given detector has a running query else false - */ - public boolean hasRunningQuery(AnomalyDetector detector) { - return throttler.getFilteredQuery(detector.getDetectorId()).isPresent(); - } - - /** - * Cancel long running query for given detectorId - * @param client OpenSearch client - * @param detectorId Anomaly Detector Id - * @param LOG Logger - */ - private void cancelRunningQuery(Client client, String detectorId, Logger LOG) { - ListTasksRequest listTasksRequest = new ListTasksRequest(); - listTasksRequest.setActions("*search*"); - client.execute(ListTasksAction.INSTANCE, listTasksRequest, ActionListener.wrap(response -> { - onListTaskResponse(response, detectorId, LOG); - }, exception -> { - LOG.error("List Tasks failed.", exception); - throw new InternalFailure(detectorId, "Failed to list current tasks", exception); - })); - } - - /** - * Helper function to handle ListTasksResponse - * @param listTasksResponse ListTasksResponse - * @param detectorId Anomaly Detector Id - * @param LOG Logger - */ - private void onListTaskResponse(ListTasksResponse listTasksResponse, String detectorId, Logger LOG) { - List tasks = listTasksResponse.getTasks(); - 
TaskId matchedParentTaskId = null; - TaskId matchedSingleTaskId = null; - for (TaskInfo task : tasks) { - if (!task.getHeaders().isEmpty() - && task.getHeaders().get(Task.X_OPAQUE_ID).equals(CommonName.ANOMALY_DETECTOR + ":" + detectorId)) { - if (!task.getParentTaskId().equals(TaskId.EMPTY_TASK_ID)) { - // we found the parent task, don't need to check more - matchedParentTaskId = task.getParentTaskId(); - break; - } else { - // we found one task, keep checking other tasks - matchedSingleTaskId = task.getTaskId(); - } - } - } - // case 1: given detectorId is not in current task list - if (matchedParentTaskId == null && matchedSingleTaskId == null) { - // log and then clear negative cache - LOG.info("Couldn't find task for detectorId: {}. Clean this entry from Throttler", detectorId); - throttler.clearFilteredQuery(detectorId); - return; - } - // case 2: we can find the task for given detectorId - CancelTasksRequest cancelTaskRequest = new CancelTasksRequest(); - if (matchedParentTaskId != null) { - cancelTaskRequest.setParentTaskId(matchedParentTaskId); - LOG.info("Start to cancel task for parentTaskId: {}", matchedParentTaskId.toString()); - } else { - cancelTaskRequest.setTaskId(matchedSingleTaskId); - LOG.info("Start to cancel task for taskId: {}", matchedSingleTaskId.toString()); - } - - client.execute(CancelTasksAction.INSTANCE, cancelTaskRequest, ActionListener.wrap(response -> { - onCancelTaskResponse(response, detectorId, LOG); - }, exception -> { - LOG.error("Failed to cancel task for detectorId: " + detectorId, exception); - throw new InternalFailure(detectorId, "Failed to cancel current tasks", exception); - })); - } - - /** - * Helper function to handle CancelTasksResponse - * @param cancelTasksResponse CancelTasksResponse - * @param detectorId Anomaly Detector Id - * @param LOG Logger - */ - private void onCancelTaskResponse(CancelTasksResponse cancelTasksResponse, String detectorId, Logger LOG) { - // todo: adding retry mechanism - List nodeFailures = cancelTasksResponse.getNodeFailures(); - List taskFailures = cancelTasksResponse.getTaskFailures(); - if (nodeFailures.isEmpty() && taskFailures.isEmpty()) { - LOG.info("Cancelling query for detectorId: {} succeeds. 
Clear entry from Throttler", detectorId); - throttler.clearFilteredQuery(detectorId); - return; - } - LOG.error("Failed to cancel task for detectorId: " + detectorId); - throw new InternalFailure(detectorId, "Failed to cancel current tasks due to node or task failures"); - } -} diff --git a/src/main/java/org/opensearch/ad/util/IndexUtils.java b/src/main/java/org/opensearch/ad/util/IndexUtils.java index b69c0924a..c93511849 100644 --- a/src/main/java/org/opensearch/ad/util/IndexUtils.java +++ b/src/main/java/org/opensearch/ad/util/IndexUtils.java @@ -13,12 +13,9 @@ import java.util.List; import java.util.Locale; -import java.util.Optional; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.action.admin.indices.stats.IndicesStatsRequest; -import org.opensearch.action.admin.indices.stats.IndicesStatsResponse; import org.opensearch.action.support.IndicesOptions; import org.opensearch.client.Client; import org.opensearch.cluster.ClusterState; @@ -28,6 +25,7 @@ import org.opensearch.cluster.metadata.IndexNameExpressionResolver; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.inject.Inject; +import org.opensearch.timeseries.util.ClientUtil; public class IndexUtils { /** @@ -110,25 +108,6 @@ public String getIndexHealthStatus(String indexOrAliasName) throws IllegalArgume return indexHealth.getStatus().name().toLowerCase(Locale.ROOT); } - /** - * Gets the number of documents in an index. - * - * @deprecated - * - * @param indexName Name of the index - * @return The number of documents in an index. 0 is returned if the index does not exist. -1 is returned if the - * request fails. - */ - @Deprecated - public Long getNumberOfDocumentsInIndex(String indexName) { - if (!clusterService.state().getRoutingTable().hasIndex(indexName)) { - return 0L; - } - IndicesStatsRequest indicesStatsRequest = new IndicesStatsRequest(); - Optional response = clientUtil.timedRequest(indicesStatsRequest, logger, client.admin().indices()::stats); - return response.map(r -> r.getIndex(indexName).getPrimaries().docs.getCount()).orElse(-1L); - } - /** * Similar to checkGlobalBlock, we check block on the indices level. * diff --git a/src/main/java/org/opensearch/ad/util/Throttler.java b/src/main/java/org/opensearch/ad/util/Throttler.java deleted file mode 100644 index 177b612a2..000000000 --- a/src/main/java/org/opensearch/ad/util/Throttler.java +++ /dev/null @@ -1,73 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.util; - -import java.time.Clock; -import java.time.Instant; -import java.util.AbstractMap; -import java.util.Map; -import java.util.Optional; -import java.util.concurrent.ConcurrentHashMap; - -import org.opensearch.action.ActionRequest; - -/** - * Utility functions for throttling query. - */ -public class Throttler { - // negativeCache is used to reject search query if given detector already has one query running - // key is detectorId, value is an entry. 
Key is ActionRequest and value is the timestamp - private final ConcurrentHashMap> negativeCache; - private final Clock clock; - - public Throttler(Clock clock) { - this.negativeCache = new ConcurrentHashMap<>(); - this.clock = clock; - } - - /** - * This will be used when dependency injection directly/indirectly injects a Throttler object. Without this object, - * node start might fail due to not being able to find a Clock object. We removed Clock object association in - * https://github.com/opendistro-for-elasticsearch/anomaly-detection/pull/305 - */ - public Throttler() { - this(Clock.systemUTC()); - } - - /** - * Get negative cache value(ActionRequest, Instant) for given detector - * @param detectorId AnomalyDetector ID - * @return negative cache value(ActionRequest, Instant) - */ - public Optional> getFilteredQuery(String detectorId) { - return Optional.ofNullable(negativeCache.get(detectorId)); - } - - /** - * Insert the negative cache entry for given detector - * If key already exists, return false. Otherwise true. - * @param detectorId AnomalyDetector ID - * @param request ActionRequest - * @return true if key doesn't exist otherwise false. - */ - public synchronized boolean insertFilteredQuery(String detectorId, ActionRequest request) { - return negativeCache.putIfAbsent(detectorId, new AbstractMap.SimpleEntry<>(request, clock.instant())) == null; - } - - /** - * Clear the negative cache for given detector. - * @param detectorId AnomalyDetector ID - */ - public void clearFilteredQuery(String detectorId) { - negativeCache.remove(detectorId); - } -} diff --git a/src/main/java/org/opensearch/forecast/constant/ForecastCommonMessages.java b/src/main/java/org/opensearch/forecast/constant/ForecastCommonMessages.java new file mode 100644 index 000000000..deb31cad7 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/constant/ForecastCommonMessages.java @@ -0,0 +1,62 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.forecast.constant; + +import static org.opensearch.forecast.constant.ForecastCommonName.CUSTOM_RESULT_INDEX_PREFIX; + +public class ForecastCommonMessages { + // ====================================== + // Validation message + // ====================================== + public static String INVALID_FORECAST_INTERVAL = "Forecast interval must be a positive integer"; + public static String NULL_FORECAST_INTERVAL = "Forecast interval should be set"; + public static String INVALID_FORECASTER_NAME = + "Valid characters for forecaster name are a-z, A-Z, 0-9, -(hyphen), _(underscore) and .(period)"; + + // ====================================== + // Resource constraints + // ====================================== + public static final String DISABLED_ERR_MSG = "Forecast functionality is disabled. 
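The deleted Throttler is essentially a single-flight guard: putIfAbsent admits at most one in-flight query per detector atomically, and the stored Instant records when it was admitted. A minimal sketch (not part of this patch):

import java.time.Clock;
import java.time.Instant;
import java.util.AbstractMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.opensearch.action.ActionRequest;

final class SingleFlightSketch {
    private final ConcurrentHashMap<String, Map.Entry<ActionRequest, Instant>> negativeCache =
        new ConcurrentHashMap<>();

    // true if this detector had no query in flight and this one was admitted
    boolean tryAdmit(String detectorId, ActionRequest request, Clock clock) {
        return negativeCache
            .putIfAbsent(detectorId, new AbstractMap.SimpleEntry<>(request, clock.instant())) == null;
    }

    void release(String detectorId) {
        negativeCache.remove(detectorId);
    }
}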
To enable update plugins.forecast.enabled to true"; + + // ====================================== + // RESTful API + // ====================================== + public static String FAIL_TO_CREATE_FORECASTER = "Failed to create forecaster"; + public static String FAIL_TO_UPDATE_FORECASTER = "Failed to update forecaster"; + public static String FAIL_TO_FIND_FORECASTER_MSG = "Can not find forecaster with id: "; + public static final String FORECASTER_ID_MISSING_MSG = "Forecaster ID is missing"; + public static final String INVALID_TIMESTAMP_ERR_MSG = "timestamp is invalid"; + public static String FAIL_TO_GET_FORECASTER = "Fail to get forecaster"; + + // ====================================== + // Security + // ====================================== + public static String NO_PERMISSION_TO_ACCESS_FORECASTER = "User does not have permissions to access forecaster: "; + public static String FAIL_TO_GET_USER_INFO = "Unable to get user information from forecaster "; + + // ====================================== + // Used for custom forecast result index + // ====================================== + public static String CAN_NOT_FIND_RESULT_INDEX = "Can't find result index "; + public static String INVALID_RESULT_INDEX_PREFIX = "Result index must start with " + CUSTOM_RESULT_INDEX_PREFIX; + + // ====================================== + // Task + // ====================================== + public static String FORECASTER_IS_RUNNING = "Forecaster is already running"; + + // ====================================== + // Job + // ====================================== + public static String FAIL_TO_START_FORECASTER = "Fail to start forecaster"; + public static String FAIL_TO_STOP_FORECASTER = "Fail to stop forecaster"; +} diff --git a/src/main/java/org/opensearch/forecast/constant/ForecastCommonName.java b/src/main/java/org/opensearch/forecast/constant/ForecastCommonName.java new file mode 100644 index 000000000..8edaf2d2b --- /dev/null +++ b/src/main/java/org/opensearch/forecast/constant/ForecastCommonName.java @@ -0,0 +1,48 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.forecast.constant; + +public class ForecastCommonName { + // ====================================== + // Validation + // ====================================== + // detector validation aspect + public static final String FORECASTER_ASPECT = "forecaster"; + + // ====================================== + // Used for custom forecast result index + // ====================================== + public static final String DUMMY_FORECAST_RESULT_ID = "dummy_forecast_result_id"; + public static final String DUMMY_FORECASTER_ID = "dummy_forecaster_id"; + public static final String CUSTOM_RESULT_INDEX_PREFIX = "opensearch-forecast-result-"; + + // ====================================== + // Index name + // ====================================== + // index name for forecast checkpoint of each model. One model one document. + public static final String FORECAST_CHECKPOINT_INDEX_NAME = ".opensearch-forecast-checkpoints"; + // index name for forecast state. Will store forecast task in this index as well. + public static final String FORECAST_STATE_INDEX = ".opensearch-forecast-state"; + // The alias of the index in which to write forecast result history. Not a hidden index. 
+ // Allow users to create dashboard or query freely on top of it. + public static final String FORECAST_RESULT_INDEX_ALIAS = "opensearch-forecast-results"; + + // ====================================== + // Used in toXContent + // ====================================== + public static final String ID_JSON_KEY = "forecasterID"; + + // ====================================== + // Used in stats API + // ====================================== + public static final String FORECASTER_ID_KEY = "forecaster_id"; +} diff --git a/src/main/java/org/opensearch/forecast/constant/ForecastCommonValue.java b/src/main/java/org/opensearch/forecast/constant/ForecastCommonValue.java new file mode 100644 index 000000000..27a4de5ed --- /dev/null +++ b/src/main/java/org/opensearch/forecast/constant/ForecastCommonValue.java @@ -0,0 +1,17 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.forecast.constant; + +public class ForecastCommonValue { + public static String INTERNAL_ACTION_PREFIX = "cluster:admin/plugin/forecastinternal/"; + public static String EXTERNAL_ACTION_PREFIX = "cluster:admin/plugin/forecast/"; +} diff --git a/src/main/java/org/opensearch/forecast/indices/ForecastIndex.java b/src/main/java/org/opensearch/forecast/indices/ForecastIndex.java new file mode 100644 index 000000000..8e514dd6e --- /dev/null +++ b/src/main/java/org/opensearch/forecast/indices/ForecastIndex.java @@ -0,0 +1,72 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.forecast.indices; + +import java.util.function.Supplier; + +import org.opensearch.ad.indices.ADIndexManagement; +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ThrowingSupplierWrapper; +import org.opensearch.timeseries.indices.TimeSeriesIndex; + +public enum ForecastIndex implements TimeSeriesIndex { + // throw RuntimeException since we don't know how to handle the case when the mapping reading throws IOException + RESULT( + ForecastCommonName.FORECAST_RESULT_INDEX_ALIAS, + true, + ThrowingSupplierWrapper.throwingSupplierWrapper(ForecastIndexManagement::getResultMappings) + ), + CONFIG(CommonName.CONFIG_INDEX, false, ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getConfigMappings)), + JOB(CommonName.JOB_INDEX, false, ThrowingSupplierWrapper.throwingSupplierWrapper(ADIndexManagement::getJobMappings)), + CHECKPOINT( + ForecastCommonName.FORECAST_CHECKPOINT_INDEX_NAME, + false, + ThrowingSupplierWrapper.throwingSupplierWrapper(ForecastIndexManagement::getCheckpointMappings) + ), + STATE( + ForecastCommonName.FORECAST_STATE_INDEX, + false, + ThrowingSupplierWrapper.throwingSupplierWrapper(ForecastIndexManagement::getStateMappings) + ); + + private final String indexName; + // whether we use an alias for the index + private final boolean alias; + private final String mapping; + + ForecastIndex(String name, boolean alias, Supplier mappingSupplier) { + this.indexName = name; + this.alias = alias; + this.mapping = mappingSupplier.get(); + } + + @Override + public String getIndexName() { + return indexName; + } + + @Override + public boolean isAlias() { + return alias; + } + + @Override + public String getMapping() { + return mapping; + } + + @Override + public boolean isJobIndex() { + return CommonName.JOB_INDEX.equals(indexName); + } +} diff --git a/src/main/java/org/opensearch/forecast/indices/ForecastIndexManagement.java b/src/main/java/org/opensearch/forecast/indices/ForecastIndexManagement.java new file mode 100644 index 000000000..e7d3f3252 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/indices/ForecastIndexManagement.java @@ -0,0 +1,271 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.forecast.indices; + +import static org.opensearch.forecast.constant.ForecastCommonName.DUMMY_FORECAST_RESULT_ID; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_CHECKPOINT_INDEX_MAPPING_FILE; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_MAX_PRIMARY_SHARDS; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_RESULTS_INDEX_MAPPING_FILE; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_RESULT_HISTORY_RETENTION_PERIOD; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD; +import static org.opensearch.forecast.settings.ForecastSettings.FORECAST_STATE_INDEX_MAPPING_FILE; + +import java.io.IOException; +import java.util.EnumMap; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.opensearch.action.admin.indices.create.CreateIndexRequest; +import org.opensearch.action.admin.indices.create.CreateIndexResponse; +import org.opensearch.action.delete.DeleteRequest; +import org.opensearch.action.index.IndexRequest; +import org.opensearch.client.Client; +import org.opensearch.cluster.service.ClusterService; +import org.opensearch.common.settings.Settings; +import org.opensearch.common.xcontent.XContentType; +import org.opensearch.core.action.ActionListener; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.forecast.model.ForecastResult; +import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.indices.IndexManagement; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; + +public class ForecastIndexManagement extends IndexManagement<ForecastIndex> { + private static final Logger logger = LogManager.getLogger(ForecastIndexManagement.class); + + // The index name pattern to query all the forecast result history indices + public static final String FORECAST_RESULT_HISTORY_INDEX_PATTERN = "<opensearch-forecast-results-history-{now/d}-1>"; + + // The index name pattern to query all forecast results, history and current forecast results + public static final String ALL_FORECAST_RESULTS_INDEX_PATTERN = "opensearch-forecast-results*"; + + /** + * Constructor + * + * @param client OS client supports administrative actions + * @param clusterService OS cluster service + * @param threadPool OS thread pool + * @param settings OS cluster setting + * @param nodeFilter Used to filter eligible nodes to host forecast indices + * @param maxUpdateRunningTimes max number of retries to update index mapping and setting + * @throws IOException IOException if mapping file can't be read correctly + */ + public ForecastIndexManagement( + Client client, + ClusterService clusterService, + ThreadPool threadPool, + Settings settings, + DiscoveryNodeFilterer nodeFilter, + int maxUpdateRunningTimes + ) + throws IOException { + super( + client, + clusterService, + threadPool, + settings, + nodeFilter, + maxUpdateRunningTimes, + ForecastIndex.class, + FORECAST_MAX_PRIMARY_SHARDS.get(settings), + FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD.get(settings), + FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD.get(settings), + FORECAST_RESULT_HISTORY_RETENTION_PERIOD.get(settings), + ForecastIndex.RESULT.getMapping() + ); + this.indexStates = new EnumMap<ForecastIndex, IndexState>(ForecastIndex.class); + 
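// Background on the registrations below: ClusterSettings#addSettingsUpdateConsumer subscribes
// a callback that fires whenever an operator changes the corresponding dynamic setting, e.g.
// via a request like the following (the setting key is assumed from the constant's name;
// ForecastSettings holds the authoritative definition):
//
//   PUT _cluster/settings
//   { "transient": { "plugins.forecast.forecast_result_history_max_docs_per_shard": 1350000000 } }
//
// Each callback overwrites the cached field (historyMaxDocs, the rollover/retention periods,
// maxPrimaryShards), so the next rollover check picks up the new value without a node restart.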
this.clusterService + .getClusterSettings() + .addSettingsUpdateConsumer(FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD, it -> historyMaxDocs = it); + + this.clusterService.getClusterSettings().addSettingsUpdateConsumer(FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD, it -> { + historyRolloverPeriod = it; + rescheduleRollover(); + }); + this.clusterService.getClusterSettings().addSettingsUpdateConsumer(FORECAST_RESULT_HISTORY_RETENTION_PERIOD, it -> { + historyRetentionPeriod = it; + }); + + this.clusterService.getClusterSettings().addSettingsUpdateConsumer(FORECAST_MAX_PRIMARY_SHARDS, it -> maxPrimaryShards = it); + + this.updateRunningTimes = 0; + } + + /** + * Get forecast result index mapping json content. + * + * @return forecast result index mapping + * @throws IOException IOException if mapping file can't be read correctly + */ + public static String getResultMappings() throws IOException { + return getMappings(FORECAST_RESULTS_INDEX_MAPPING_FILE); + } + + /** + * Get forecaster state index mapping json content. + * + * @return forecaster state index mapping + * @throws IOException IOException if mapping file can't be read correctly + */ + public static String getStateMappings() throws IOException { + String forecastStateMappings = getMappings(FORECAST_STATE_INDEX_MAPPING_FILE); + String forecasterIndexMappings = getConfigMappings(); + forecasterIndexMappings = forecasterIndexMappings + .substring(forecasterIndexMappings.indexOf("\"properties\""), forecasterIndexMappings.lastIndexOf("}")); + return forecastStateMappings.replace("FORECASTER_INDEX_MAPPING_PLACE_HOLDER", forecasterIndexMappings); + } + + /** + * Get checkpoint index mapping json content. + * + * @return checkpoint index mapping + * @throws IOException IOException if mapping file can't be read correctly + */ + public static String getCheckpointMappings() throws IOException { + return getMappings(FORECAST_CHECKPOINT_INDEX_MAPPING_FILE); + } + + /** + * Whether the default forecast result index exists or not. + * + * @return true if default forecast result index exists + */ + @Override + public boolean doesDefaultResultIndexExist() { + return doesAliasExist(ForecastCommonName.FORECAST_RESULT_INDEX_ALIAS); + } + + /** + * Whether the forecast state index exists or not. + * + * @return true if forecast state index exists + */ + @Override + public boolean doesStateIndexExist() { + return doesIndexExist(ForecastCommonName.FORECAST_STATE_INDEX); + } + + /** + * Whether the checkpoint index exists or not. + * + * @return true if checkpoint index exists + */ + @Override + public boolean doesCheckpointIndexExist() { + return doesIndexExist(ForecastCommonName.FORECAST_CHECKPOINT_INDEX_NAME); + } + + /** + * Create the state index. + * + * @param actionListener action called after create index + */ + @Override + public void initStateIndex(ActionListener<CreateIndexResponse> actionListener) { + try { + CreateIndexRequest request = new CreateIndexRequest(ForecastCommonName.FORECAST_STATE_INDEX) + .mapping(getStateMappings(), XContentType.JSON) + .settings(settings); + adminClient.indices().create(request, markMappingUpToDate(ForecastIndex.STATE, actionListener)); + } catch (IOException e) { + logger.error("Fail to init forecast state index", e); + actionListener.onFailure(e); + } + } + + /** + * Create the checkpoint index. 
+ * + * @param actionListener action called after create index + * @throws EndRunException EndRunException due to failure to get mapping + */ + @Override + public void initCheckpointIndex(ActionListener<CreateIndexResponse> actionListener) { + String mapping; + try { + mapping = getCheckpointMappings(); + } catch (IOException e) { + throw new EndRunException("", "Cannot find checkpoint mapping file", true); + } + CreateIndexRequest request = new CreateIndexRequest(ForecastCommonName.FORECAST_CHECKPOINT_INDEX_NAME) + .mapping(mapping, XContentType.JSON); + choosePrimaryShards(request, true); + adminClient.indices().create(request, markMappingUpToDate(ForecastIndex.CHECKPOINT, actionListener)); + } + + @Override + protected void rolloverAndDeleteHistoryIndex() { + rolloverAndDeleteHistoryIndex( + ForecastCommonName.FORECAST_RESULT_INDEX_ALIAS, + ALL_FORECAST_RESULTS_INDEX_PATTERN, + FORECAST_RESULT_HISTORY_INDEX_PATTERN, + ForecastIndex.RESULT + ); + } + + /** + * Create the config index directly. + * + * @param actionListener action called after create index + * @throws IOException IOException from {@link IndexManagement#getConfigMappings} + */ + @Override + public void initConfigIndex(ActionListener<CreateIndexResponse> actionListener) throws IOException { + super.initConfigIndex(markMappingUpToDate(ForecastIndex.CONFIG, actionListener)); + } + + /** + * Create the job index. + * + * @param actionListener action called after create index + */ + @Override + public void initJobIndex(ActionListener<CreateIndexResponse> actionListener) { + super.initJobIndex(markMappingUpToDate(ForecastIndex.JOB, actionListener)); + } + + @Override + protected IndexRequest createDummyIndexRequest(String resultIndex) throws IOException { + ForecastResult dummyResult = ForecastResult.getDummyResult(); + return new IndexRequest(resultIndex) + .id(DUMMY_FORECAST_RESULT_ID) + .source(dummyResult.toXContent(XContentBuilder.builder(XContentType.JSON.xContent()), ToXContent.EMPTY_PARAMS)); + } + + @Override + protected DeleteRequest createDummyDeleteRequest(String resultIndex) throws IOException { + return new DeleteRequest(resultIndex).id(DUMMY_FORECAST_RESULT_ID); + } + + @Override + public void initDefaultResultIndexDirectly(ActionListener<CreateIndexResponse> actionListener) { + initResultIndexDirectly( + FORECAST_RESULT_HISTORY_INDEX_PATTERN, + ForecastIndex.RESULT.getIndexName(), + false, + FORECAST_RESULT_HISTORY_INDEX_PATTERN, + ForecastIndex.RESULT, + actionListener + ); + } + + @Override + public void initCustomResultIndexDirectly(String resultIndex, ActionListener<CreateIndexResponse> actionListener) { + initResultIndexDirectly(resultIndex, null, false, FORECAST_RESULT_HISTORY_INDEX_PATTERN, ForecastIndex.RESULT, actionListener); + } +} diff --git a/src/main/java/org/opensearch/forecast/model/ForecastResult.java b/src/main/java/org/opensearch/forecast/model/ForecastResult.java new file mode 100644 index 000000000..1ce75ff63 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/model/ForecastResult.java @@ -0,0 +1,590 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.forecast.model; + +import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.forecast.constant.ForecastCommonName.DUMMY_FORECASTER_ID; + +import java.io.IOException; +import java.time.Instant; +import java.util.ArrayList; +import java.util.List; +import java.util.Optional; + +import org.apache.commons.lang.builder.ToStringBuilder; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.ParseField; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.xcontent.NamedXContentRegistry; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.constant.CommonValue; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IndexableResult; +import org.opensearch.timeseries.util.ParseUtils; + +import com.google.common.base.Objects; + +/** + * Include result returned from RCF model and feature data. + */ +public class ForecastResult extends IndexableResult { + public static final String PARSE_FIELD_NAME = "ForecastResult"; + public static final NamedXContentRegistry.Entry XCONTENT_REGISTRY = new NamedXContentRegistry.Entry( + ForecastResult.class, + new ParseField(PARSE_FIELD_NAME), + it -> parse(it) + ); + + public static final String FEATURE_ID_FIELD = "feature_id"; + public static final String VALUE_FIELD = "forecast_value"; + public static final String LOWER_BOUND_FIELD = "forecast_lower_bound"; + public static final String UPPER_BOUND_FIELD = "forecast_upper_bound"; + public static final String INTERVAL_WIDTH_FIELD = "confidence_interval_width"; + public static final String FORECAST_DATA_START_TIME_FIELD = "forecast_data_start_time"; + public static final String FORECAST_DATA_END_TIME_FIELD = "forecast_data_end_time"; + public static final String HORIZON_INDEX_FIELD = "horizon_index"; + + private final String featureId; + private final Float forecastValue; + private final Float lowerBound; + private final Float upperBound; + private final Float confidenceIntervalWidth; + private final Instant forecastDataStartTime; + private final Instant forecastDataEndTime; + private final Integer horizonIndex; + protected final Double dataQuality; + + // used when indexing exception or error or an empty result + public ForecastResult( + String forecasterId, + String taskId, + List featureData, + Instant dataStartTime, + Instant dataEndTime, + Instant executionStartTime, + Instant executionEndTime, + String error, + Optional entity, + User user, + Integer schemaVersion, + String modelId + ) { + this( + forecasterId, + taskId, + Double.NaN, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + null, + null, + null, + null, + null, + null, + null + ); + } + + public ForecastResult( + String forecasterId, + String taskId, + Double dataQuality, + List featureData, + Instant dataStartTime, + Instant dataEndTime, + Instant executionStartTime, + Instant executionEndTime, + String error, + Optional entity, + User user, + Integer schemaVersion, + String modelId, + String featureId, + Float 
forecastValue, + Float lowerBound, + Float upperBound, + Instant forecastDataStartTime, + Instant forecastDataEndTime, + Integer horizonIndex + ) { + super( + forecasterId, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + taskId + ); + this.featureId = featureId; + this.dataQuality = dataQuality; + this.forecastValue = forecastValue; + this.lowerBound = lowerBound; + this.upperBound = upperBound; + this.confidenceIntervalWidth = lowerBound != null && upperBound != null ? Math.abs(upperBound - lowerBound) : Float.NaN; + this.forecastDataStartTime = forecastDataStartTime; + this.forecastDataEndTime = forecastDataEndTime; + this.horizonIndex = horizonIndex; + } + + public static List fromRawRCFCasterResult( + String forecasterId, + long intervalMillis, + Double dataQuality, + List featureData, + Instant dataStartTime, + Instant dataEndTime, + Instant executionStartTime, + Instant executionEndTime, + String error, + Optional entity, + User user, + Integer schemaVersion, + String modelId, + float[] forecastsValues, + float[] forecastsUppers, + float[] forecastsLowers, + String taskId + ) { + int inputLength = featureData.size(); + int numberOfForecasts = forecastsValues.length / inputLength; + + List convertedForecastValues = new ArrayList<>(numberOfForecasts); + + // store feature data and forecast value separately for easy query on feature data + // we can join them using forecasterId, entityId, and executionStartTime/executionEndTime + convertedForecastValues + .add( + new ForecastResult( + forecasterId, + taskId, + dataQuality, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + null, + null, + null, + null, + null, + null, + -1 + ) + ); + Instant forecastDataStartTime = dataEndTime; + + for (int i = 0; i < numberOfForecasts; i++) { + Instant forecastDataEndTime = forecastDataStartTime.plusMillis(intervalMillis); + for (int j = 0; j < inputLength; j++) { + int k = i * inputLength + j; + convertedForecastValues + .add( + new ForecastResult( + forecasterId, + taskId, + dataQuality, + null, + null, + null, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + featureData.get(j).getFeatureId(), + forecastsValues[k], + forecastsLowers[k], + forecastsUppers[k], + forecastDataStartTime, + forecastDataEndTime, + i + ) + ); + } + forecastDataStartTime = forecastDataEndTime; + } + + return convertedForecastValues; + } + + public ForecastResult(StreamInput input) throws IOException { + super(input); + this.featureId = input.readOptionalString(); + this.dataQuality = input.readOptionalDouble(); + this.forecastValue = input.readOptionalFloat(); + this.lowerBound = input.readOptionalFloat(); + this.upperBound = input.readOptionalFloat(); + this.confidenceIntervalWidth = input.readOptionalFloat(); + this.forecastDataStartTime = input.readOptionalInstant(); + this.forecastDataEndTime = input.readOptionalInstant(); + this.horizonIndex = input.readOptionalInt(); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + XContentBuilder xContentBuilder = builder + .startObject() + .field(ForecastCommonName.FORECASTER_ID_KEY, configId) + .field(CommonName.SCHEMA_VERSION_FIELD, schemaVersion); + + if (dataStartTime != null) { + xContentBuilder.field(CommonName.DATA_START_TIME_FIELD, dataStartTime.toEpochMilli()); + } + if 
(dataEndTime != null) { + xContentBuilder.field(CommonName.DATA_END_TIME_FIELD, dataEndTime.toEpochMilli()); + } + if (featureData != null) { + // can be null during preview + xContentBuilder.field(CommonName.FEATURE_DATA_FIELD, featureData.toArray()); + } + if (executionStartTime != null) { + // can be null during preview + xContentBuilder.field(CommonName.EXECUTION_START_TIME_FIELD, executionStartTime.toEpochMilli()); + } + if (executionEndTime != null) { + // can be null during preview + xContentBuilder.field(CommonName.EXECUTION_END_TIME_FIELD, executionEndTime.toEpochMilli()); + } + if (error != null) { + xContentBuilder.field(CommonName.ERROR_FIELD, error); + } + if (optionalEntity.isPresent()) { + xContentBuilder.field(CommonName.ENTITY_FIELD, optionalEntity.get()); + } + if (user != null) { + xContentBuilder.field(CommonName.USER_FIELD, user); + } + if (modelId != null) { + xContentBuilder.field(CommonName.MODEL_ID_FIELD, modelId); + } + if (dataQuality != null && !dataQuality.isNaN()) { + xContentBuilder.field(CommonName.DATA_QUALITY_FIELD, dataQuality); + } + if (taskId != null) { + xContentBuilder.field(CommonName.TASK_ID_FIELD, taskId); + } + if (entityId != null) { + xContentBuilder.field(CommonName.ENTITY_ID_FIELD, entityId); + } + if (forecastValue != null) { + xContentBuilder.field(VALUE_FIELD, forecastValue); + } + if (lowerBound != null) { + xContentBuilder.field(LOWER_BOUND_FIELD, lowerBound); + } + if (upperBound != null) { + xContentBuilder.field(UPPER_BOUND_FIELD, upperBound); + } + if (forecastDataStartTime != null) { + xContentBuilder.field(FORECAST_DATA_START_TIME_FIELD, forecastDataStartTime.toEpochMilli()); + } + if (forecastDataEndTime != null) { + xContentBuilder.field(FORECAST_DATA_END_TIME_FIELD, forecastDataEndTime.toEpochMilli()); + } + if (horizonIndex != null) { + xContentBuilder.field(HORIZON_INDEX_FIELD, horizonIndex); + } + if (featureId != null) { + xContentBuilder.field(FEATURE_ID_FIELD, featureId); + } + + return xContentBuilder.endObject(); + } + + public static ForecastResult parse(XContentParser parser) throws IOException { + String forecasterId = null; + Double dataQuality = null; + List featureData = null; + Instant dataStartTime = null; + Instant dataEndTime = null; + Instant executionStartTime = null; + Instant executionEndTime = null; + String error = null; + Entity entity = null; + User user = null; + Integer schemaVersion = CommonValue.NO_SCHEMA_VERSION; + String modelId = null; + String taskId = null; + + String featureId = null; + Float forecastValue = null; + Float lowerBound = null; + Float upperBound = null; + Instant forecastDataStartTime = null; + Instant forecastDataEndTime = null; + Integer horizonIndex = null; + + ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_OBJECT) { + String fieldName = parser.currentName(); + parser.nextToken(); + + switch (fieldName) { + case ForecastCommonName.FORECASTER_ID_KEY: + forecasterId = parser.text(); + break; + case CommonName.DATA_QUALITY_FIELD: + dataQuality = parser.doubleValue(); + break; + case CommonName.FEATURE_DATA_FIELD: + ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); + featureData = new ArrayList<>(); + while (parser.nextToken() != XContentParser.Token.END_ARRAY) { + featureData.add(FeatureData.parse(parser)); + } + break; + case CommonName.DATA_START_TIME_FIELD: + dataStartTime = ParseUtils.toInstant(parser); + break; + case 
CommonName.DATA_END_TIME_FIELD: + dataEndTime = ParseUtils.toInstant(parser); + break; + case CommonName.EXECUTION_START_TIME_FIELD: + executionStartTime = ParseUtils.toInstant(parser); + break; + case CommonName.EXECUTION_END_TIME_FIELD: + executionEndTime = ParseUtils.toInstant(parser); + break; + case CommonName.ERROR_FIELD: + error = parser.text(); + break; + case CommonName.ENTITY_FIELD: + entity = Entity.parse(parser); + break; + case CommonName.USER_FIELD: + user = User.parse(parser); + break; + case CommonName.SCHEMA_VERSION_FIELD: + schemaVersion = parser.intValue(); + break; + case CommonName.MODEL_ID_FIELD: + modelId = parser.text(); + break; + case FEATURE_ID_FIELD: + featureId = parser.text(); + break; + case LOWER_BOUND_FIELD: + lowerBound = parser.floatValue(); + break; + case UPPER_BOUND_FIELD: + upperBound = parser.floatValue(); + break; + case VALUE_FIELD: + forecastValue = parser.floatValue(); + break; + case FORECAST_DATA_START_TIME_FIELD: + forecastDataStartTime = ParseUtils.toInstant(parser); + break; + case FORECAST_DATA_END_TIME_FIELD: + forecastDataEndTime = ParseUtils.toInstant(parser); + break; + case CommonName.TASK_ID_FIELD: + taskId = parser.text(); + break; + case HORIZON_INDEX_FIELD: + horizonIndex = parser.intValue(); + break; + default: + parser.skipChildren(); + break; + } + } + + return new ForecastResult( + forecasterId, + taskId, + dataQuality, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + Optional.ofNullable(entity), + user, + schemaVersion, + modelId, + featureId, + forecastValue, + lowerBound, + upperBound, + forecastDataStartTime, + forecastDataEndTime, + horizonIndex + ); + } + + @Generated + @Override + public boolean equals(Object o) { + if (this == o) + return true; + if (o == null || getClass() != o.getClass()) + return false; + if (!super.equals(o)) + return false; + ForecastResult that = (ForecastResult) o; + return Objects.equal(featureId, that.featureId) + && Objects.equal(dataQuality, that.dataQuality) + && Objects.equal(forecastValue, that.forecastValue) + && Objects.equal(lowerBound, that.lowerBound) + && Objects.equal(upperBound, that.upperBound) + && Objects.equal(confidenceIntervalWidth, that.confidenceIntervalWidth) + && Objects.equal(forecastDataStartTime, that.forecastDataStartTime) + && Objects.equal(forecastDataEndTime, that.forecastDataEndTime) + && Objects.equal(horizonIndex, that.horizonIndex); + } + + @Generated + @Override + public int hashCode() { + final int prime = 31; + int result = super.hashCode(); + result = prime * result + Objects + .hashCode( + featureId, + dataQuality, + forecastValue, + lowerBound, + upperBound, + confidenceIntervalWidth, + forecastDataStartTime, + forecastDataEndTime, + horizonIndex + ); + return result; + } + + @Generated + @Override + public String toString() { + return super.toString() + + ", " + + new ToStringBuilder(this) + .append("featureId", featureId) + .append("dataQuality", dataQuality) + .append("forecastValue", forecastValue) + .append("lowerBound", lowerBound) + .append("upperBound", upperBound) + .append("confidenceIntervalWidth", confidenceIntervalWidth) + .append("forecastDataStartTime", forecastDataStartTime) + .append("forecastDataEndTime", forecastDataEndTime) + .append("horizonIndex", horizonIndex) + .toString(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + + out.writeOptionalString(featureId); + out.writeOptionalDouble(dataQuality); + 
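// The remaining writes must stay in exactly the order read back by ForecastResult(StreamInput):
// transport serialization is positional, and each writeOptional* call emits a presence flag
// followed by the value only when it is non-null, which is what lets nullable fields such as
// lowerBound and upperBound round-trip between nodes.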
out.writeOptionalFloat(forecastValue); + out.writeOptionalFloat(lowerBound); + out.writeOptionalFloat(upperBound); + out.writeOptionalFloat(confidenceIntervalWidth); + out.writeOptionalInstant(forecastDataStartTime); + out.writeOptionalInstant(forecastDataEndTime); + out.writeOptionalInt(horizonIndex); + } + + public static ForecastResult getDummyResult() { + return new ForecastResult( + DUMMY_FORECASTER_ID, + null, + null, + null, + null, + null, + null, + null, + Optional.empty(), + null, + CommonValue.NO_SCHEMA_VERSION, + null + ); + } + + /** + * Used to throw away requests when index pressure is high. + * @return true if an error is present, so the result is kept even under high index pressure. + */ + @Override + public boolean isHighPriority() { + // AnomalyResult.toXContent won't record Double.NaN and thus make it null + return getError() != null; + } + + public Double getDataQuality() { + return dataQuality; + } + + public String getFeatureId() { + return featureId; + } + + public Float getForecastValue() { + return forecastValue; + } + + public Float getLowerBound() { + return lowerBound; + } + + public Float getUpperBound() { + return upperBound; + } + + public Float getConfidenceIntervalWidth() { + return confidenceIntervalWidth; + } + + public Instant getForecastDataStartTime() { + return forecastDataStartTime; + } + + public Instant getForecastDataEndTime() { + return forecastDataEndTime; + } + + public Integer getHorizonIndex() { + return horizonIndex; + } +} diff --git a/src/main/java/org/opensearch/forecast/model/ForecastTask.java b/src/main/java/org/opensearch/forecast/model/ForecastTask.java new file mode 100644 index 000000000..4d7e889d7 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/model/ForecastTask.java @@ -0,0 +1,389 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; + +import java.io.IOException; +import java.time.Instant; + +import org.opensearch.commons.authuser.User; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.TimeSeriesTask; +import org.opensearch.timeseries.util.ParseUtils; + +import com.google.common.base.Objects; + +public class ForecastTask extends TimeSeriesTask { + public static final String FORECASTER_ID_FIELD = "forecaster_id"; + public static final String FORECASTER_FIELD = "forecaster"; + public static final String DATE_RANGE_FIELD = "date_range"; + + private Forecaster forecaster = null; + private DateRange dateRange = null; + + private ForecastTask() {} + + public ForecastTask(StreamInput input) throws IOException { + this.taskId = input.readOptionalString(); + this.taskType = input.readOptionalString(); + this.configId = input.readOptionalString(); + if (input.readBoolean()) { + this.forecaster = new Forecaster(input); + } else { + this.forecaster = null; + } + this.state = input.readOptionalString(); + this.taskProgress = input.readOptionalFloat(); + this.initProgress = input.readOptionalFloat(); + this.currentPiece = input.readOptionalInstant(); + this.executionStartTime = input.readOptionalInstant(); + this.executionEndTime = input.readOptionalInstant(); + 
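// Scalar fields in this constructor use readOptional* (presence flag plus value), while nested
// objects (Forecaster above, and User/DateRange/Entity below) are guarded by an explicit
// readBoolean() and rebuilt through their StreamInput constructors; the two encodings are
// equivalent, the explicit flag simply mirrors the branching in writeTo.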
this.isLatest = input.readOptionalBoolean(); + this.error = input.readOptionalString(); + this.checkpointId = input.readOptionalString(); + this.lastUpdateTime = input.readOptionalInstant(); + this.startedBy = input.readOptionalString(); + this.stoppedBy = input.readOptionalString(); + this.coordinatingNode = input.readOptionalString(); + this.workerNode = input.readOptionalString(); + if (input.readBoolean()) { + this.user = new User(input); + } else { + user = null; + } + if (input.readBoolean()) { + this.dateRange = new DateRange(input); + } else { + this.dateRange = null; + } + if (input.readBoolean()) { + this.entity = new Entity(input); + } else { + this.entity = null; + } + this.parentTaskId = input.readOptionalString(); + this.estimatedMinutesLeft = input.readOptionalInt(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeOptionalString(taskId); + out.writeOptionalString(taskType); + out.writeOptionalString(configId); + if (forecaster != null) { + out.writeBoolean(true); + forecaster.writeTo(out); + } else { + out.writeBoolean(false); + } + out.writeOptionalString(state); + out.writeOptionalFloat(taskProgress); + out.writeOptionalFloat(initProgress); + out.writeOptionalInstant(currentPiece); + out.writeOptionalInstant(executionStartTime); + out.writeOptionalInstant(executionEndTime); + out.writeOptionalBoolean(isLatest); + out.writeOptionalString(error); + out.writeOptionalString(checkpointId); + out.writeOptionalInstant(lastUpdateTime); + out.writeOptionalString(startedBy); + out.writeOptionalString(stoppedBy); + out.writeOptionalString(coordinatingNode); + out.writeOptionalString(workerNode); + if (user != null) { + out.writeBoolean(true); // user exists + user.writeTo(out); + } else { + out.writeBoolean(false); // user does not exist + } + // Only forward forecast task to nodes with same version, so it's ok to write these new fields. 
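// The null-guarded blocks below are the hand-rolled form of out.writeOptionalWriteable(dateRange)
// and out.writeOptionalWriteable(entity) (assuming DateRange and Entity implement Writeable, as
// their writeTo methods suggest): a presence boolean precedes each nested object so the reader
// knows whether to deserialize it.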
+ if (dateRange != null) { + out.writeBoolean(true); + dateRange.writeTo(out); + } else { + out.writeBoolean(false); + } + if (entity != null) { + out.writeBoolean(true); + entity.writeTo(out); + } else { + out.writeBoolean(false); + } + out.writeOptionalString(parentTaskId); + out.writeOptionalInt(estimatedMinutesLeft); + } + + public static Builder builder() { + return new Builder(); + } + + @Override + public boolean isEntityTask() { + return ForecastTaskType.FORECAST_HISTORICAL_HC_ENTITY.name().equals(taskType); + } + + public static class Builder extends TimeSeriesTask.Builder { + private Forecaster forecaster = null; + private DateRange dateRange = null; + + public Builder() {} + + public Builder forecaster(Forecaster forecaster) { + this.forecaster = forecaster; + return this; + } + + public Builder dateRange(DateRange dateRange) { + this.dateRange = dateRange; + return this; + } + + public ForecastTask build() { + ForecastTask forecastTask = new ForecastTask(); + forecastTask.taskId = this.taskId; + forecastTask.lastUpdateTime = this.lastUpdateTime; + forecastTask.error = this.error; + forecastTask.state = this.state; + forecastTask.configId = this.configId; + forecastTask.taskProgress = this.taskProgress; + forecastTask.initProgress = this.initProgress; + forecastTask.currentPiece = this.currentPiece; + forecastTask.executionStartTime = this.executionStartTime; + forecastTask.executionEndTime = this.executionEndTime; + forecastTask.isLatest = this.isLatest; + forecastTask.taskType = this.taskType; + forecastTask.checkpointId = this.checkpointId; + forecastTask.forecaster = this.forecaster; + forecastTask.startedBy = this.startedBy; + forecastTask.stoppedBy = this.stoppedBy; + forecastTask.coordinatingNode = this.coordinatingNode; + forecastTask.workerNode = this.workerNode; + forecastTask.dateRange = this.dateRange; + forecastTask.entity = this.entity; + forecastTask.parentTaskId = this.parentTaskId; + forecastTask.estimatedMinutesLeft = this.estimatedMinutesLeft; + forecastTask.user = this.user; + + return forecastTask; + } + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + XContentBuilder xContentBuilder = builder.startObject(); + xContentBuilder = super.toXContent(xContentBuilder, params); + if (configId != null) { + xContentBuilder.field(FORECASTER_ID_FIELD, configId); + } + if (forecaster != null) { + xContentBuilder.field(FORECASTER_FIELD, forecaster); + } + if (dateRange != null) { + xContentBuilder.field(DATE_RANGE_FIELD, dateRange); + } + return xContentBuilder.endObject(); + } + + public static ForecastTask parse(XContentParser parser) throws IOException { + return parse(parser, null); + } + + public static ForecastTask parse(XContentParser parser, String taskId) throws IOException { + Instant lastUpdateTime = null; + String startedBy = null; + String stoppedBy = null; + String error = null; + String state = null; + String configId = null; + Float taskProgress = null; + Float initProgress = null; + Instant currentPiece = null; + Instant executionStartTime = null; + Instant executionEndTime = null; + Boolean isLatest = null; + String taskType = null; + String checkpointId = null; + Forecaster forecaster = null; + String parsedTaskId = taskId; + String coordinatingNode = null; + String workerNode = null; + DateRange dateRange = null; + Entity entity = null; + String parentTaskId = null; + Integer estimatedMinutesLeft = null; + User user = null; + + ensureExpectedToken(XContentParser.Token.START_OBJECT, 
parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_OBJECT) { + String fieldName = parser.currentName(); + parser.nextToken(); + + switch (fieldName) { + case LAST_UPDATE_TIME_FIELD: + lastUpdateTime = ParseUtils.toInstant(parser); + break; + case STARTED_BY_FIELD: + startedBy = parser.text(); + break; + case STOPPED_BY_FIELD: + stoppedBy = parser.text(); + break; + case ERROR_FIELD: + error = parser.text(); + break; + case STATE_FIELD: + state = parser.text(); + break; + case FORECASTER_ID_FIELD: + configId = parser.text(); + break; + case TASK_PROGRESS_FIELD: + taskProgress = parser.floatValue(); + break; + case INIT_PROGRESS_FIELD: + initProgress = parser.floatValue(); + break; + case CURRENT_PIECE_FIELD: + currentPiece = ParseUtils.toInstant(parser); + break; + case EXECUTION_START_TIME_FIELD: + executionStartTime = ParseUtils.toInstant(parser); + break; + case EXECUTION_END_TIME_FIELD: + executionEndTime = ParseUtils.toInstant(parser); + break; + case IS_LATEST_FIELD: + isLatest = parser.booleanValue(); + break; + case TASK_TYPE_FIELD: + taskType = parser.text(); + break; + case CHECKPOINT_ID_FIELD: + checkpointId = parser.text(); + break; + case FORECASTER_FIELD: + forecaster = Forecaster.parse(parser); + break; + case TASK_ID_FIELD: + parsedTaskId = parser.text(); + break; + case COORDINATING_NODE_FIELD: + coordinatingNode = parser.text(); + break; + case WORKER_NODE_FIELD: + workerNode = parser.text(); + break; + case DATE_RANGE_FIELD: + dateRange = DateRange.parse(parser); + break; + case ENTITY_FIELD: + entity = Entity.parse(parser); + break; + case PARENT_TASK_ID_FIELD: + parentTaskId = parser.text(); + break; + case ESTIMATED_MINUTES_LEFT_FIELD: + estimatedMinutesLeft = parser.intValue(); + break; + case USER_FIELD: + user = User.parse(parser); + break; + default: + parser.skipChildren(); + break; + } + } + Forecaster copyForecaster = forecaster == null + ? 
null + : new Forecaster( + configId, + forecaster.getVersion(), + forecaster.getName(), + forecaster.getDescription(), + forecaster.getTimeField(), + forecaster.getIndices(), + forecaster.getFeatureAttributes(), + forecaster.getFilterQuery(), + forecaster.getInterval(), + forecaster.getWindowDelay(), + forecaster.getShingleSize(), + forecaster.getUiMetadata(), + forecaster.getSchemaVersion(), + forecaster.getLastUpdateTime(), + forecaster.getCategoryFields(), + forecaster.getUser(), + forecaster.getCustomResultIndex(), + forecaster.getHorizon(), + forecaster.getImputationOption() + ); + return new Builder() + .taskId(parsedTaskId) + .lastUpdateTime(lastUpdateTime) + .startedBy(startedBy) + .stoppedBy(stoppedBy) + .error(error) + .state(state) + .configId(configId) + .taskProgress(taskProgress) + .initProgress(initProgress) + .currentPiece(currentPiece) + .executionStartTime(executionStartTime) + .executionEndTime(executionEndTime) + .isLatest(isLatest) + .taskType(taskType) + .checkpointId(checkpointId) + .coordinatingNode(coordinatingNode) + .workerNode(workerNode) + .forecaster(copyForecaster) + .dateRange(dateRange) + .entity(entity) + .parentTaskId(parentTaskId) + .estimatedMinutesLeft(estimatedMinutesLeft) + .user(user) + .build(); + } + + @Generated + @Override + public boolean equals(Object other) { + if (this == other) + return true; + if (other == null || getClass() != other.getClass()) + return false; + ForecastTask that = (ForecastTask) other; + return super.equals(that) + && Objects.equal(getForecaster(), that.getForecaster()) + && Objects.equal(getDateRange(), that.getDateRange()); + } + + @Generated + @Override + public int hashCode() { + int superHashCode = super.hashCode(); + int hash = Objects.hashCode(configId, forecaster, dateRange); + hash += 89 * superHashCode; + return hash; + } + + public Forecaster getForecaster() { + return forecaster; + } + + public DateRange getDateRange() { + return dateRange; + } + + public void setDateRange(DateRange dateRange) { + this.dateRange = dateRange; + } +} diff --git a/src/main/java/org/opensearch/forecast/model/ForecastTaskType.java b/src/main/java/org/opensearch/forecast/model/ForecastTaskType.java new file mode 100644 index 000000000..76e1aac88 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/model/ForecastTaskType.java @@ -0,0 +1,69 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.forecast.model; + +import java.util.List; + +import org.opensearch.timeseries.model.TaskType; + +import com.google.common.collect.ImmutableList; + +/** + * The ForecastTaskType enum defines different task types for forecasting, categorized into real-time and historical settings. + * In real-time forecasting, we monitor states at the forecaster level, resulting in two distinct task types: one for + * single-stream forecasting and another for high cardinality (HC). In the historical setting, state tracking is more nuanced, + * encompassing both entity and forecaster levels. This leads to three specific task types: a forecaster-level task dedicated + * to single-stream forecasting, and two tasks for HC, one at the forecaster level and another at the entity level. 
+ * + * Real-time forecasting: + * - FORECAST_REALTIME_SINGLE_STREAM: Represents a task type for single-stream forecasting. Ideal for scenarios where a single + * time series is processed in real-time. + * - FORECAST_REALTIME_HC_FORECASTER: Represents a task type for high cardinality (HC) forecasting. Used when dealing with a + * large number of distinct entities in real-time. + * + * Historical forecasting: + * - FORECAST_HISTORICAL_SINGLE_STREAM: Represents a forecaster-level task for single-stream historical forecasting. + * Suitable for analyzing a single time series in a sequential manner. + * - FORECAST_HISTORICAL_HC_FORECASTER: A forecaster-level task to track overall state, initialization progress, errors, etc., + * for HC forecasting. Central to managing multiple historical time series with high cardinality. + * - FORECAST_HISTORICAL_HC_ENTITY: An entity-level task to track the state, initialization progress, errors, etc., of a + * specific entity within HC historical forecasting. Allows for fine-grained information recording at the entity level. + * + */ +public enum ForecastTaskType implements TaskType { + FORECAST_REALTIME_SINGLE_STREAM, + FORECAST_REALTIME_HC_FORECASTER, + FORECAST_HISTORICAL_SINGLE_STREAM, + // forecaster level task to track overall state, init progress, error etc. for HC forecaster + FORECAST_HISTORICAL_HC_FORECASTER, + // entity level task to track just one specific entity's state, init progress, error etc. + FORECAST_HISTORICAL_HC_ENTITY; + + public static List HISTORICAL_FORECASTER_TASK_TYPES = ImmutableList + .of(ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM); + public static List ALL_HISTORICAL_TASK_TYPES = ImmutableList + .of( + ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM, + ForecastTaskType.FORECAST_HISTORICAL_HC_ENTITY + ); + public static List REALTIME_TASK_TYPES = ImmutableList + .of(ForecastTaskType.FORECAST_REALTIME_SINGLE_STREAM, ForecastTaskType.FORECAST_REALTIME_HC_FORECASTER); + public static List ALL_FORECAST_TASK_TYPES = ImmutableList + .of( + ForecastTaskType.FORECAST_REALTIME_SINGLE_STREAM, + ForecastTaskType.FORECAST_REALTIME_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM, + ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_HC_ENTITY + ); +} diff --git a/src/main/java/org/opensearch/forecast/model/Forecaster.java b/src/main/java/org/opensearch/forecast/model/Forecaster.java new file mode 100644 index 000000000..c572c28db --- /dev/null +++ b/src/main/java/org/opensearch/forecast/model/Forecaster.java @@ -0,0 +1,405 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.forecast.constant.ForecastCommonName.CUSTOM_RESULT_INDEX_PREFIX; +import static org.opensearch.index.query.AbstractQueryBuilder.parseInnerQueryBuilder; + +import java.io.IOException; +import java.time.Instant; +import java.time.temporal.ChronoUnit; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; + +import org.opensearch.common.unit.TimeValue; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.ParseField; +import org.opensearch.core.common.ParsingException; +import org.opensearch.core.common.io.stream.StreamInput; +import 
org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.xcontent.NamedXContentRegistry; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParseException; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.forecast.constant.ForecastCommonMessages; +import org.opensearch.forecast.settings.ForecastNumericSetting; +import org.opensearch.index.query.QueryBuilder; +import org.opensearch.index.query.QueryBuilders; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.constant.CommonValue; +import org.opensearch.timeseries.dataprocessor.ImputationOption; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.TimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.ParseUtils; + +import com.google.common.base.Objects; + +/** + * Similar to AnomalyDetector, Forecaster defines config object. We cannot inherit from + * AnomalyDetector as AnomalyDetector uses detection interval but Forecaster doesn't + * need it and has to set it to null. Detection interval being null would fail + * AnomalyDetector's constructor because detection interval cannot be null. + */ +public class Forecaster extends Config { + public static final String FORECAST_PARSE_FIELD_NAME = "Forecaster"; + public static final NamedXContentRegistry.Entry XCONTENT_REGISTRY = new NamedXContentRegistry.Entry( + Forecaster.class, + new ParseField(FORECAST_PARSE_FIELD_NAME), + it -> parse(it) + ); + + public static final String HORIZON_FIELD = "horizon"; + public static final String FORECAST_INTERVAL_FIELD = "forecast_interval"; + public static final int DEFAULT_HORIZON_SHINGLE_RATIO = 3; + + private Integer horizon; + + public Forecaster( + String forecasterId, + Long version, + String name, + String description, + String timeField, + List indices, + List features, + QueryBuilder filterQuery, + TimeConfiguration forecastInterval, + TimeConfiguration windowDelay, + Integer shingleSize, + Map uiMetadata, + Integer schemaVersion, + Instant lastUpdateTime, + List categoryFields, + User user, + String resultIndex, + Integer horizon, + ImputationOption imputationOption + ) { + super( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + forecastInterval, + imputationOption + ); + + checkAndThrowValidationErrors(ValidationAspect.FORECASTER); + + if (forecastInterval == null) { + errorMessage = ForecastCommonMessages.NULL_FORECAST_INTERVAL; + issueType = ValidationIssueType.FORECAST_INTERVAL; + } else if (((IntervalTimeConfiguration) forecastInterval).getInterval() <= 0) { + errorMessage = ForecastCommonMessages.INVALID_FORECAST_INTERVAL; + issueType = ValidationIssueType.FORECAST_INTERVAL; + } + + int maxCategoryFields = ForecastNumericSetting.maxCategoricalFields(); + if (categoryFields != null && categoryFields.size() > maxCategoryFields) { + errorMessage = 
CommonMessages.getTooManyCategoricalFieldErr(maxCategoryFields); + issueType = ValidationIssueType.CATEGORY; + } + + if (invalidHorizon(horizon)) { + errorMessage = "Horizon size must be a positive integer no larger than " + + TimeSeriesSettings.MAX_SHINGLE_SIZE * DEFAULT_HORIZON_SHINGLE_RATIO + + ". Got " + + horizon; + issueType = ValidationIssueType.SHINGLE_SIZE_FIELD; + } + + checkAndThrowValidationErrors(ValidationAspect.FORECASTER); + + this.horizon = horizon; + } + + public Forecaster(StreamInput input) throws IOException { + super(input); + horizon = input.readInt(); + } + + @Override + public void writeTo(StreamOutput output) throws IOException { + super.writeTo(output); + output.writeInt(horizon); + } + + public boolean invalidHorizon(Integer horizonToTest) { + return horizonToTest != null + && (horizonToTest < 1 || horizonToTest > TimeSeriesSettings.MAX_SHINGLE_SIZE * DEFAULT_HORIZON_SHINGLE_RATIO); + } + + /** + * Parse raw json content into forecaster instance. + * + * @param parser json based content parser + * @return forecaster instance + * @throws IOException IOException if content can't be parsed correctly + */ + public static Forecaster parse(XContentParser parser) throws IOException { + return parse(parser, null); + } + + public static Forecaster parse(XContentParser parser, String forecasterId) throws IOException { + return parse(parser, forecasterId, null); + } + + /** + * Parse raw json content and given forecaster id into forecaster instance. + * + * @param parser json based content parser + * @param forecasterId forecaster id + * @param version forecaster document version + * @return forecaster instance + * @throws IOException IOException if content can't be parsed correctly + */ + public static Forecaster parse(XContentParser parser, String forecasterId, Long version) throws IOException { + return parse(parser, forecasterId, version, null, null); + } + + /** + * Parse raw json content and given forecaster id into forecaster instance. + * + * @param parser json based content parser + * @param forecasterId forecaster id + * @param version forecast document version + * @param defaultForecastInterval default forecaster interval + * @param defaultForecastWindowDelay default forecaster window delay + * @return forecaster instance + * @throws IOException IOException if content can't be parsed correctly + */ + public static Forecaster parse( + XContentParser parser, + String forecasterId, + Long version, + TimeValue defaultForecastInterval, + TimeValue defaultForecastWindowDelay + ) throws IOException { + String name = null; + String description = ""; + String timeField = null; + List indices = new ArrayList(); + QueryBuilder filterQuery = QueryBuilders.matchAllQuery(); + TimeConfiguration forecastInterval = defaultForecastInterval == null + ? null + : new IntervalTimeConfiguration(defaultForecastInterval.getMinutes(), ChronoUnit.MINUTES); + TimeConfiguration windowDelay = defaultForecastWindowDelay == null + ? 
null + : new IntervalTimeConfiguration(defaultForecastWindowDelay.getSeconds(), ChronoUnit.SECONDS); + Integer shingleSize = null; + List features = new ArrayList<>(); + Integer schemaVersion = CommonValue.NO_SCHEMA_VERSION; + Map uiMetadata = null; + Instant lastUpdateTime = null; + User user = null; + String resultIndex = null; + + List categoryField = null; + Integer horizon = null; + ImputationOption interpolationOption = null; + + ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_OBJECT) { + String fieldName = parser.currentName(); + parser.nextToken(); + + switch (fieldName) { + case NAME_FIELD: + name = parser.text(); + break; + case DESCRIPTION_FIELD: + description = parser.text(); + break; + case TIMEFIELD_FIELD: + timeField = parser.text(); + break; + case INDICES_FIELD: + ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_ARRAY) { + indices.add(parser.text()); + } + break; + case UI_METADATA_FIELD: + uiMetadata = parser.map(); + break; + case CommonName.SCHEMA_VERSION_FIELD: + schemaVersion = parser.intValue(); + break; + case FILTER_QUERY_FIELD: + ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser); + try { + filterQuery = parseInnerQueryBuilder(parser); + } catch (ParsingException | XContentParseException e) { + throw new ValidationException( + "Custom query error in data filter: " + e.getMessage(), + ValidationIssueType.FILTER_QUERY, + ValidationAspect.FORECASTER + ); + } catch (IllegalArgumentException e) { + if (!e.getMessage().contains("empty clause")) { + throw e; + } + } + break; + case FORECAST_INTERVAL_FIELD: + try { + forecastInterval = TimeConfiguration.parse(parser); + } catch (Exception e) { + if (e instanceof IllegalArgumentException && e.getMessage().contains(CommonMessages.NEGATIVE_TIME_CONFIGURATION)) { + throw new ValidationException( + "Forecasting interval must be a positive integer", + ValidationIssueType.FORECAST_INTERVAL, + ValidationAspect.FORECASTER + ); + } + throw e; + } + break; + case FEATURE_ATTRIBUTES_FIELD: + try { + ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_ARRAY) { + features.add(Feature.parse(parser)); + } + } catch (Exception e) { + if (e instanceof ParsingException || e instanceof XContentParseException) { + throw new ValidationException( + "Custom query error: " + e.getMessage(), + ValidationIssueType.FEATURE_ATTRIBUTES, + ValidationAspect.FORECASTER + ); + } + throw e; + } + break; + case WINDOW_DELAY_FIELD: + try { + windowDelay = TimeConfiguration.parse(parser); + } catch (Exception e) { + if (e instanceof IllegalArgumentException && e.getMessage().contains(CommonMessages.NEGATIVE_TIME_CONFIGURATION)) { + throw new ValidationException( + "Window delay interval must be a positive integer", + ValidationIssueType.WINDOW_DELAY, + ValidationAspect.FORECASTER + ); + } + throw e; + } + break; + case SHINGLE_SIZE_FIELD: + shingleSize = parser.intValue(); + break; + case LAST_UPDATE_TIME_FIELD: + lastUpdateTime = ParseUtils.toInstant(parser); + break; + case CATEGORY_FIELD: + categoryField = (List) parser.list(); + break; + case USER_FIELD: + user = User.parse(parser); + break; + case RESULT_INDEX_FIELD: + resultIndex = parser.text(); + break; + case HORIZON_FIELD: + horizon = parser.intValue(); + break; + case 
IMPUTATION_OPTION_FIELD: + interpolationOption = ImputationOption.parse(parser); + break; + default: + parser.skipChildren(); + break; + } + } + Forecaster forecaster = new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + getShingleSize(shingleSize), + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryField, + user, + resultIndex, + horizon, + interpolationOption + ); + return forecaster; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + XContentBuilder xContentBuilder = builder.startObject(); + xContentBuilder = super.toXContent(xContentBuilder, params); + xContentBuilder.field(FORECAST_INTERVAL_FIELD, interval).field(HORIZON_FIELD, horizon); + + return xContentBuilder.endObject(); + } + + @Override + public boolean equals(Object o) { + if (this == o) + return true; + if (o == null || getClass() != o.getClass()) + return false; + Forecaster forecaster = (Forecaster) o; + return super.equals(o) && Objects.equal(horizon, forecaster.horizon); + } + + @Override + public int hashCode() { + int hash = super.hashCode(); + hash = 89 * hash + (this.horizon != null ? this.horizon.hashCode() : 0); + return hash; + } + + @Override + public String validateCustomResultIndex(String resultIndex) { + if (resultIndex != null && !resultIndex.startsWith(CUSTOM_RESULT_INDEX_PREFIX)) { + return ForecastCommonMessages.INVALID_RESULT_INDEX_PREFIX; + } + return super.validateCustomResultIndex(resultIndex); + } + + @Override + protected ValidationAspect getConfigValidationAspect() { + return ValidationAspect.FORECASTER; + } + + public Integer getHorizon() { + return horizon; + } +} diff --git a/src/main/java/org/opensearch/forecast/settings/ForecastEnabledSetting.java b/src/main/java/org/opensearch/forecast/settings/ForecastEnabledSetting.java new file mode 100644 index 000000000..1db9bf340 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/settings/ForecastEnabledSetting.java @@ -0,0 +1,92 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.settings; + +import static java.util.Collections.unmodifiableMap; +import static org.opensearch.common.settings.Setting.Property.Dynamic; +import static org.opensearch.common.settings.Setting.Property.NodeScope; + +import java.util.HashMap; +import java.util.Map; + +import org.opensearch.common.settings.Setting; +import org.opensearch.timeseries.settings.DynamicNumericSetting; + +public class ForecastEnabledSetting extends DynamicNumericSetting { + + /** + * Singleton instance + */ + private static ForecastEnabledSetting INSTANCE; + + /** + * Settings name + */ + public static final String FORECAST_ENABLED = "plugins.forecast.enabled"; + + public static final String FORECAST_BREAKER_ENABLED = "plugins.forecast.breaker.enabled"; + + public static final String FORECAST_DOOR_KEEPER_IN_CACHE_ENABLED = "plugins.forecast.door_keeper_in_cache.enabled"; + + public static final Map<String, Setting<?>> settings = unmodifiableMap(new HashMap<String, Setting<?>>() { + { + /** + * forecast enable/disable setting + */ + put(FORECAST_ENABLED, Setting.boolSetting(FORECAST_ENABLED, true, NodeScope, Dynamic)); + + /** + * forecast breaker enable/disable setting + */ + put(FORECAST_BREAKER_ENABLED, Setting.boolSetting(FORECAST_BREAKER_ENABLED, true, NodeScope, Dynamic)); + + /** + * We have a bloom filter placed in front of inactive entity cache to + * filter out unpopular items 
that are not likely to appear more + than once. Whether this bloom filter is enabled or not. + */ + put( + FORECAST_DOOR_KEEPER_IN_CACHE_ENABLED, + Setting.boolSetting(FORECAST_DOOR_KEEPER_IN_CACHE_ENABLED, false, NodeScope, Dynamic) + ); + } + }); + + private ForecastEnabledSetting(Map<String, Setting<?>> settings) { + super(settings); + } + + public static synchronized ForecastEnabledSetting getInstance() { + if (INSTANCE == null) { + INSTANCE = new ForecastEnabledSetting(settings); + } + return INSTANCE; + } + + /** + * Whether forecasting is enabled. If disabled, the time series plugin rejects RESTful requests about forecasting and stops all forecasting jobs. + * @return whether forecasting is enabled. + */ + public static boolean isForecastEnabled() { + return ForecastEnabledSetting.getInstance().getSettingValue(ForecastEnabledSetting.FORECAST_ENABLED); + } + + /** + * Whether the forecast circuit breaker is enabled or not. If disabled, an open circuit breaker wouldn't cause a forecast job to be stopped. + * @return whether the forecast circuit breaker is enabled or not. + */ + public static boolean isForecastBreakerEnabled() { + return ForecastEnabledSetting.getInstance().getSettingValue(ForecastEnabledSetting.FORECAST_BREAKER_ENABLED); + } + + /** + * If enabled, we filter out unpopular items that are not likely to appear more than once. + * @return whether the door keeper in cache is enabled or not. + */ + public static boolean isDoorKeeperInCacheEnabled() { + return ForecastEnabledSetting.getInstance().getSettingValue(ForecastEnabledSetting.FORECAST_DOOR_KEEPER_IN_CACHE_ENABLED); + } +} diff --git a/src/main/java/org/opensearch/forecast/settings/ForecastNumericSetting.java b/src/main/java/org/opensearch/forecast/settings/ForecastNumericSetting.java new file mode 100644 index 000000000..271321575 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/settings/ForecastNumericSetting.java @@ -0,0 +1,59 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.settings; + +import static java.util.Collections.unmodifiableMap; + +import java.util.HashMap; +import java.util.Map; + +import org.opensearch.common.settings.Setting; +import org.opensearch.timeseries.settings.DynamicNumericSetting; + +public class ForecastNumericSetting extends DynamicNumericSetting { + /** + * Singleton instance + */ + private static ForecastNumericSetting INSTANCE; + + /** + * Settings name + */ + public static final String CATEGORY_FIELD_LIMIT = "plugins.forecast.category_field_limit"; + + private static final Map<String, Setting<?>> settings = unmodifiableMap(new HashMap<String, Setting<?>>() { + { + // how many categorical fields we support + // The number of category fields won't cause correctness issues for our + // implementation, but can cause performance issues. The more categorical + // fields, the larger the forecast results and intermediate states, and the + // more expensive the queries (e.g., to get top entities in the preview API, we need + // to use scripts in a terms aggregation; the more fields, the slower the query).
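A minimal sketch of how a caller might enforce the limit registered just below; the validation method itself is an illustrative assumption, not part of this patch (only ForecastNumericSetting.maxCategoricalFields() is real):

    // Hypothetical guard, e.g. run when a forecaster is created or validated.
    void validateCategoryFields(List<String> categoryFields) {
        int limit = ForecastNumericSetting.maxCategoricalFields(); // dynamic; defaults to 2, hard-capped at 5
        if (categoryFields != null && categoryFields.size() > limit) {
            throw new IllegalArgumentException(
                "Got " + categoryFields.size() + " categorical fields; at most " + limit + " are allowed"
            );
        }
    }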
+ put( + CATEGORY_FIELD_LIMIT, + Setting.intSetting(CATEGORY_FIELD_LIMIT, 2, 0, 5, Setting.Property.NodeScope, Setting.Property.Dynamic) + ); + } + }); + + ForecastNumericSetting(Map<String, Setting<?>> settings) { + super(settings); + } + + public static synchronized ForecastNumericSetting getInstance() { + if (INSTANCE == null) { + INSTANCE = new ForecastNumericSetting(settings); + } + return INSTANCE; + } + + /** + * @return the max number of categorical fields + */ + public static int maxCategoricalFields() { + return ForecastNumericSetting.getInstance().getSettingValue(ForecastNumericSetting.CATEGORY_FIELD_LIMIT); + } +} diff --git a/src/main/java/org/opensearch/forecast/settings/ForecastSettings.java b/src/main/java/org/opensearch/forecast/settings/ForecastSettings.java new file mode 100644 index 000000000..8aeaeb6c3 --- /dev/null +++ b/src/main/java/org/opensearch/forecast/settings/ForecastSettings.java @@ -0,0 +1,389 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.settings; + +import org.opensearch.common.settings.Setting; +import org.opensearch.common.unit.TimeValue; + +public final class ForecastSettings { + // ====================================== + // config parameters + // ====================================== + public static final Setting<TimeValue> FORECAST_INTERVAL = Setting + .positiveTimeSetting( + "plugins.forecast.default_interval", + TimeValue.timeValueMinutes(10), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting<TimeValue> FORECAST_WINDOW_DELAY = Setting + .timeSetting( + "plugins.forecast.default_window_delay", + TimeValue.timeValueMinutes(0), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // restful apis + // ====================================== + public static final Setting<TimeValue> FORECAST_REQUEST_TIMEOUT = Setting + .positiveTimeSetting( + "plugins.forecast.request_timeout", + TimeValue.timeValueSeconds(10), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // cleanup resource setting + // ====================================== + public static final Setting<Boolean> DELETE_FORECAST_RESULT_WHEN_DELETE_FORECASTER = Setting + .boolSetting( + "plugins.forecast.delete_forecast_result_when_delete_forecaster", + false, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // resource constraint + // ====================================== + public static final Setting<Integer> MAX_SINGLE_STREAM_FORECASTERS = Setting + .intSetting("plugins.forecast.max_forecasters", 1000, 0, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + public static final Setting<Integer> MAX_HC_FORECASTERS = Setting + .intSetting("plugins.forecast.max_hc_forecasters", 10, 0, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // save partial zero-anomaly grade results after indexing pressure reaches the limit. + // The Opendistro version has a similar setting. I lowered the value to make room + // for INDEX_PRESSURE_HARD_LIMIT. I didn't find a floatSetting that has both default + // and fallback values. I want users to use the new default value 0.6 instead of 0.8, + // so I do not plan to use the value of the legacy setting as a fallback. + // (A sketch of the intended two-tier behavior follows.)
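A minimal sketch of the two-tier behavior the comment above describes. ForecastResult here stands in for the result type, and bulkIndex(), isImportant(), and hasError() are assumed helpers for illustration; none of this code is part of the patch:

    // Illustrative only: how a result-write path might consult the two limits.
    void onResult(ForecastResult result, float pressure, float softLimit, float hardLimit) {
        if (pressure <= softLimit) {
            bulkIndex(result);                                  // normal load: index every result
        } else if (pressure <= hardLimit && isImportant(result)) {
            bulkIndex(result);                                  // soft limit crossed: keep only important results
        } else if (hasError(result)) {
            bulkIndex(result);                                  // beyond that: keep only errored results
        }                                                       // everything else is shed
    }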
+ public static final Setting<Float> FORECAST_INDEX_PRESSURE_SOFT_LIMIT = Setting + .floatSetting("plugins.forecast.index_pressure_soft_limit", 0.6f, 0.0f, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // save only error or larger-than-one anomaly grade results after indexing + // pressure reaches the limit + // opensearch-only setting + public static final Setting<Float> FORECAST_INDEX_PRESSURE_HARD_LIMIT = Setting + .floatSetting("plugins.forecast.index_pressure_hard_limit", 0.9f, 0.0f, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // we only allow single feature forecast now + public static final int MAX_FORECAST_FEATURES = 1; + + // ====================================== + // Forecast index setting + // ====================================== + public static int FORECAST_MAX_UPDATE_RETRY_TIMES = 10_000; + + // ====================================== + // Indices + // ====================================== + public static final Setting<Long> FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD = Setting + .longSetting( + "plugins.forecast.forecast_result_history_max_docs_per_shard", + // Total documents in the primary shards. + // Note the count is for Lucene docs. Lucene considers a nested + // doc a doc too. One result on average equals 4 Lucene docs. + // A single Lucene doc is roughly 46.8 bytes (measured by experiments). + // 1.35 billion docs is about 65 GB, and one shard can have at most 65 GB. + // This number in Lucene doc count is used in RolloverRequest#addMaxIndexDocsCondition + // for adding a condition to check if the index has at least numDocs. + 1_350_000_000L, + 0L, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting<TimeValue> FORECAST_RESULT_HISTORY_RETENTION_PERIOD = Setting + .positiveTimeSetting( + "plugins.forecast.forecast_result_history_retention_period", + TimeValue.timeValueDays(30), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting<TimeValue> FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD = Setting + .positiveTimeSetting( + "plugins.forecast.forecast_result_history_rollover_period", + TimeValue.timeValueHours(12), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final String FORECAST_RESULTS_INDEX_MAPPING_FILE = "mappings/forecast-results.json"; + public static final String FORECAST_STATE_INDEX_MAPPING_FILE = "mappings/forecast-state.json"; + public static final String FORECAST_CHECKPOINT_INDEX_MAPPING_FILE = "mappings/forecast-checkpoint.json"; + + // max number of primary shards of a forecast index + public static final Setting<Integer> FORECAST_MAX_PRIMARY_SHARDS = Setting + .intSetting("plugins.forecast.max_primary_shards", 10, 0, 200, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // saving checkpoint every 12 hours. + // To support 1 million entities in 36 data nodes, each node has roughly 28K models. + // In each hour, we roughly need to save 2400 models. Since each model save can + // take about 1 second (default value of FORECAST_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS), + // we need up to 2400 seconds of each hour to finish saving checkpoints.
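The arithmetic behind the 12-hour default above, written out (illustrative only; the figures come from the comment, not from code in this patch):

    // Illustrative arithmetic for the 12h checkpoint-saving default:
    int modelsPerNode = 1_000_000 / 36;    // ≈ 27,778 — roughly 28K models per node
    int savesPerHour = modelsPerNode / 12; // ≈ 2,315 — roughly 2,400 checkpoint saves per hour
    // at ~1 s per save, that is ~2,400 s of work in each 3,600 s hour, so the default fits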
+ public static final Setting<TimeValue> FORECAST_CHECKPOINT_SAVING_FREQ = Setting + .positiveTimeSetting( + "plugins.forecast.checkpoint_saving_freq", + TimeValue.timeValueHours(12), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting<TimeValue> FORECAST_CHECKPOINT_TTL = Setting + .positiveTimeSetting( + "plugins.forecast.checkpoint_ttl", + TimeValue.timeValueDays(7), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // Security + // ====================================== + public static final Setting<Boolean> FORECAST_FILTER_BY_BACKEND_ROLES = Setting + .boolSetting("plugins.forecast.filter_by_backend_roles", false, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // ====================================== + // Task + // ====================================== + public static int MAX_OLD_FORECAST_TASK_DOCS = 1000; + + public static final Setting<Integer> MAX_OLD_TASK_DOCS_PER_FORECASTER = Setting + .intSetting( + "plugins.forecast.max_old_task_docs_per_forecaster", + // One forecast task is roughly 1.5KB in the normal case. Suppose a task's size + // is 2KB conservatively. If we stored 1000 forecast tasks for each of 1000 + // forecasters, that would be 2GB. + 1, + 1, // keep at least 1 old task per forecaster + MAX_OLD_FORECAST_TASK_DOCS, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // Maximum number of deleted tasks to keep in cache. + public static final Setting<Integer> MAX_CACHED_DELETED_TASKS = Setting + .intSetting("plugins.forecast.max_cached_deleted_tasks", 1000, 1, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // ====================================== + // rate-limiting queue parameters + // ====================================== + /** + * ES recommends bulk size to be 5~15 MB. + * ref: https://tinyurl.com/3zdbmbwy + * Assume each checkpoint takes roughly 200KB; 25 requests add up to 5 MB. + */ + public static final Setting<Integer> FORECAST_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE = Setting + .intSetting("plugins.forecast.checkpoint_write_queue_batch_size", 25, 1, 60, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // expected execution time per checkpoint maintain request. This setting controls + // the speed of checkpoint maintenance execution. The larger, the faster, and + // the greater the performance impact on customers' workloads. + public static final Setting<Integer> FORECAST_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS = Setting + .intSetting( + "plugins.forecast.expected_checkpoint_maintain_time_in_millisecs", + 1000, + 0, + 3600000, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + /** + * Max concurrent checkpoint writes per node + */ + public static final Setting<Integer> FORECAST_CHECKPOINT_WRITE_QUEUE_CONCURRENCY = Setting + .intSetting("plugins.forecast.checkpoint_write_queue_concurrency", 2, 1, 10, Setting.Property.NodeScope, Setting.Property.Dynamic); + + /** + * Max concurrent cold starts per node + */ + public static final Setting<Integer> FORECAST_COLD_START_QUEUE_CONCURRENCY = Setting + .intSetting("plugins.forecast.cold_start_queue_concurrency", 1, 1, 10, Setting.Property.NodeScope, Setting.Property.Dynamic); + + /** + * Max concurrent result writes per node. Since a checkpoint is relatively large + * (250KB), we have 2 concurrent threads processing the queue.
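+ * <p>Worked example (an editor's illustration using the figures above): at ~250 KB per + * checkpoint, a batch of 25 writes is about 6 MB, so each bulk request stays within the + * recommended 5~15 MB window even with two writers running concurrently.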
+ */ + public static final Setting FORECAST_RESULT_WRITE_QUEUE_CONCURRENCY = Setting + .intSetting("plugins.forecast.result_write_queue_concurrency", 2, 1, 10, Setting.Property.NodeScope, Setting.Property.Dynamic); + + /** + * ES recommends bulk size to be 5~15 MB. + * ref: https://tinyurl.com/3zdbmbwy + * Assume each result takes roughly 1KB. 5000 requests are of 5 MB. + */ + public static final Setting FORECAST_RESULT_WRITE_QUEUE_BATCH_SIZE = Setting + .intSetting("plugins.forecast.result_write_queue_batch_size", 5000, 1, 15000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + /** + * Max concurrent checkpoint reads per node + */ + public static final Setting FORECAST_CHECKPOINT_READ_QUEUE_CONCURRENCY = Setting + .intSetting("plugins.forecast.checkpoint_read_queue_concurrency", 1, 1, 10, Setting.Property.NodeScope, Setting.Property.Dynamic); + + /** + * Assume each checkpoint takes roughly 200KB. 25 requests are of 5 MB. + */ + public static final Setting FORECAST_CHECKPOINT_READ_QUEUE_BATCH_SIZE = Setting + .intSetting("plugins.forecast.checkpoint_read_queue_batch_size", 25, 1, 60, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // expected execution time per cold entity request. This setting controls + // the speed of cold entity requests execution. The larger, the faster, and + // the more performance impact to customers' workload. + public static final Setting FORECAST_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS = Setting + .intSetting( + "plugins.forecast.expected_cold_entity_execution_time_in_millisecs", + 3000, + 0, + 3600000, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // the percentage of heap usage allowed for queues holding large requests + // set it to 0 to disable the queue + public static final Setting FORECAST_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.checkpoint_write_queue_max_heap_percent", + 0.01f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting FORECAST_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.checkpoint_maintain_queue_max_heap_percent", + 0.001f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting FORECAST_COLD_START_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.cold_start_queue_max_heap_percent", + 0.001f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting FORECAST_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.result_write_queue_max_heap_percent", + 0.01f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting FORECAST_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.checkpoint_read_queue_max_heap_percent", + 0.001f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting FORECAST_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT = Setting + .floatSetting( + "plugins.forecast.cold_entity_queue_max_heap_percent", + 0.001f, + 0.0f, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // fault tolerance + // ====================================== + public static final Setting FORECAST_BACKOFF_INITIAL_DELAY = Setting + .positiveTimeSetting( + "plugins.forecast.backoff_initial_delay", + TimeValue.timeValueMillis(1000), + Setting.Property.NodeScope, + 
Setting.Property.Dynamic + ); + + public static final Setting<Integer> FORECAST_MAX_RETRY_FOR_BACKOFF = Setting + .intSetting("plugins.forecast.max_retry_for_backoff", 3, 0, Setting.Property.NodeScope, Setting.Property.Dynamic); + + public static final Setting<TimeValue> FORECAST_BACKOFF_MINUTES = Setting + .positiveTimeSetting( + "plugins.forecast.backoff_minutes", + TimeValue.timeValueMinutes(15), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting<Integer> FORECAST_MAX_RETRY_FOR_END_RUN_EXCEPTION = Setting + .intSetting("plugins.forecast.max_retry_for_end_run_exception", 6, 0, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // ====================================== + // cache related parameters + // ====================================== + /* + * Opensearch-only setting + * Each forecaster has its dedicated cache that stores ten entities' states per node for HC + * and one entity's state per node for a single-stream forecaster. + * A forecaster's hottest entities load their states into the dedicated cache. + * Other forecasters cannot use space reserved by a forecaster's dedicated cache. + * DEDICATED_CACHE_SIZE is a setting to make the dedicated cache's size flexible. + * When that setting is changed, if the size decreases, we will release memory + * if required (e.g., when a user also decreased ForecastSettings.FORECAST_MODEL_MAX_SIZE_PERCENTAGE, + * the max memory percentage that the forecasting plugin can use); + * if the size increases, we may reject the setting change if we cannot fulfill + * that request (e.g., when it would use more memory than allowed for forecasting). + * + * With compact RCF, a forest with 30 trees and shingle size 4 is about 500KB. + * The recommended max heap size is 32 GB. Even if users use all of the heap + * for forecasting, the max number of entity models cannot surpass + * 32 GB/500KB = 3.2 * 10^10 / 5 * 10^5 = 6.4 * 10^4, + * where 32 GB is the recommended max heap size. + * That's why I am using 60_000 as the max limit. + */ + public static final Setting<Integer> FORECAST_DEDICATED_CACHE_SIZE = Setting + .intSetting("plugins.forecast.dedicated_cache_size", 10, 0, 60_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + public static final Setting<Double> FORECAST_MODEL_MAX_SIZE_PERCENTAGE = Setting + .doubleSetting("plugins.forecast.model_max_size_percent", 0.1, 0, 0.9, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // ====================================== + // pagination setting + // ====================================== + // pagination size + public static final Setting<Integer> FORECAST_PAGE_SIZE = Setting + .intSetting("plugins.forecast.page_size", 1_000, 0, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // Increasing the value will add pressure to indexing forecast results and our feature query. + // OpenSearch-only setting, as the legacy default (1000) is too low. + public static final Setting<Integer> FORECAST_MAX_ENTITIES_PER_INTERVAL = Setting + .intSetting( + "plugins.forecast.max_entities_per_interval", + 1_000_000, + 0, + 2_000_000, + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // stats/profile API setting + // ====================================== + // the max number of models to return per node.
+ // the setting is used to limit resource usage due to showing models + public static final Setting FORECAST_MAX_MODEL_SIZE_PER_NODE = Setting + .intSetting("plugins.forecast.max_model_size_per_node", 100, 1, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + +} diff --git a/src/main/java/org/opensearch/timeseries/AnalysisType.java b/src/main/java/org/opensearch/timeseries/AnalysisType.java new file mode 100644 index 000000000..7d7cc805e --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/AnalysisType.java @@ -0,0 +1,11 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries; + +public enum AnalysisType { + AD, + FORECAST +} diff --git a/src/main/java/org/opensearch/ad/CleanState.java b/src/main/java/org/opensearch/timeseries/CleanState.java similarity index 94% rename from src/main/java/org/opensearch/ad/CleanState.java rename to src/main/java/org/opensearch/timeseries/CleanState.java index ae8085e88..fac03b453 100644 --- a/src/main/java/org/opensearch/ad/CleanState.java +++ b/src/main/java/org/opensearch/timeseries/CleanState.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; /** * Represent a state organized via detectorId. When deleting a detector's state, diff --git a/src/main/java/org/opensearch/timeseries/ExceptionRecorder.java b/src/main/java/org/opensearch/timeseries/ExceptionRecorder.java new file mode 100644 index 000000000..5b692e96f --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/ExceptionRecorder.java @@ -0,0 +1,20 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries; + +import java.util.Optional; + +public interface ExceptionRecorder { + public void setException(String id, Exception e); + + public Optional fetchExceptionAndClear(String id); +} diff --git a/src/main/java/org/opensearch/ad/ExpiringState.java b/src/main/java/org/opensearch/timeseries/ExpiringState.java similarity index 94% rename from src/main/java/org/opensearch/ad/ExpiringState.java rename to src/main/java/org/opensearch/timeseries/ExpiringState.java index 0df0e1f51..f5e6d3669 100644 --- a/src/main/java/org/opensearch/ad/ExpiringState.java +++ b/src/main/java/org/opensearch/timeseries/ExpiringState.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import java.time.Duration; import java.time.Instant; diff --git a/src/main/java/org/opensearch/ad/MaintenanceState.java b/src/main/java/org/opensearch/timeseries/MaintenanceState.java similarity index 96% rename from src/main/java/org/opensearch/ad/MaintenanceState.java rename to src/main/java/org/opensearch/timeseries/MaintenanceState.java index 646715f7a..07bbb9546 100644 --- a/src/main/java/org/opensearch/ad/MaintenanceState.java +++ b/src/main/java/org/opensearch/timeseries/MaintenanceState.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad; +package org.opensearch.timeseries; import java.time.Duration; import java.util.Map; diff --git a/src/main/java/org/opensearch/ad/MemoryTracker.java b/src/main/java/org/opensearch/timeseries/MemoryTracker.java similarity index 83% rename from src/main/java/org/opensearch/ad/MemoryTracker.java rename to src/main/java/org/opensearch/timeseries/MemoryTracker.java index b3d163779..1599960b3 100644 --- a/src/main/java/org/opensearch/ad/MemoryTracker.java +++ b/src/main/java/org/opensearch/timeseries/MemoryTracker.java @@ -9,9 +9,9 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE; import java.util.EnumMap; import java.util.Locale; @@ -19,55 +19,57 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.breaker.ADCircuitBreakerService; -import org.opensearch.ad.common.exception.LimitExceededException; import org.opensearch.cluster.service.ClusterService; import org.opensearch.monitor.jvm.JvmService; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.LimitExceededException; import com.amazon.randomcutforest.RandomCutForest; import com.amazon.randomcutforest.parkservices.ThresholdedRandomCutForest; /** - * Class to track AD memory usage. + * Responsible for tracking and managing the memory consumption related to OpenSearch time series analysis. + * It offers functionalities to: + * - Track the total memory consumption and consumption per specific origin. + * - Monitor reserved memory bytes. + * - Decide if memory can be allocated based on the current usage and the heap limit. + * - Estimate the memory size for a ThresholdedRandomCutForest model based on various parameters. * */ public class MemoryTracker { private static final Logger LOG = LogManager.getLogger(MemoryTracker.class); public enum Origin { - SINGLE_ENTITY_DETECTOR, - HC_DETECTOR, + REAL_TIME_DETECTOR, HISTORICAL_SINGLE_ENTITY_DETECTOR, + REAL_TIME_FORECASTER } // memory tracker for total consumption of bytes - private long totalMemoryBytes; - private final Map totalMemoryBytesByOrigin; + protected long totalMemoryBytes; + protected final Map totalMemoryBytesByOrigin; // reserved for models. Cannot be deleted at will. 
- private long reservedMemoryBytes; - private final Map reservedMemoryBytesByOrigin; - private long heapSize; - private long heapLimitBytes; - private long desiredModelSize; + protected long reservedMemoryBytes; + protected final Map reservedMemoryBytesByOrigin; + protected long heapSize; + protected long heapLimitBytes; // we observe threshold model uses a fixed size array and the size is the same - private int thresholdModelBytes; - private ADCircuitBreakerService adCircuitBreakerService; + protected int thresholdModelBytes; + protected CircuitBreakerService timeSeriesCircuitBreakerService; /** * Constructor * * @param jvmService Service providing jvm info * @param modelMaxSizePercentage Percentage of heap for the max size of a model - * @param modelDesiredSizePercentage percentage of heap for the desired size of a model * @param clusterService Cluster service object - * @param adCircuitBreakerService Memory circuit breaker + * @param timeSeriesCircuitBreakerService Memory circuit breaker */ public MemoryTracker( JvmService jvmService, double modelMaxSizePercentage, - double modelDesiredSizePercentage, ClusterService clusterService, - ADCircuitBreakerService adCircuitBreakerService + CircuitBreakerService timeSeriesCircuitBreakerService ) { this.totalMemoryBytes = 0; this.totalMemoryBytesByOrigin = new EnumMap(Origin.class); @@ -75,40 +77,14 @@ public MemoryTracker( this.reservedMemoryBytesByOrigin = new EnumMap(Origin.class); this.heapSize = jvmService.info().getMem().getHeapMax().getBytes(); this.heapLimitBytes = (long) (heapSize * modelMaxSizePercentage); - this.desiredModelSize = (long) (heapSize * modelDesiredSizePercentage); if (clusterService != null) { clusterService .getClusterSettings() - .addSettingsUpdateConsumer(MODEL_MAX_SIZE_PERCENTAGE, it -> this.heapLimitBytes = (long) (heapSize * it)); + .addSettingsUpdateConsumer(AD_MODEL_MAX_SIZE_PERCENTAGE, it -> this.heapLimitBytes = (long) (heapSize * it)); } this.thresholdModelBytes = 180_000; - this.adCircuitBreakerService = adCircuitBreakerService; - } - - /** - * This function derives from the old code: https://tinyurl.com/2eaabja6 - * - * @param detectorId Detector Id - * @param trcf Thresholded random cut forest model - * @return true if there is enough memory; otherwise throw LimitExceededException. - */ - public synchronized boolean isHostingAllowed(String detectorId, ThresholdedRandomCutForest trcf) { - long requiredBytes = estimateTRCFModelSize(trcf); - if (canAllocateReserved(requiredBytes)) { - return true; - } else { - throw new LimitExceededException( - detectorId, - String - .format( - Locale.ROOT, - "Exceeded memory limit. New size is %d bytes and max limit is %d bytes", - reservedMemoryBytes + requiredBytes, - heapLimitBytes - ) - ); - } + this.timeSeriesCircuitBreakerService = timeSeriesCircuitBreakerService; } /** @@ -117,7 +93,7 @@ public synchronized boolean isHostingAllowed(String detectorId, ThresholdedRando * true when circuit breaker is closed and there is enough reserved memory. */ public synchronized boolean canAllocateReserved(long requiredBytes) { - return (false == adCircuitBreakerService.isOpen() && reservedMemoryBytes + requiredBytes <= heapLimitBytes); + return (false == timeSeriesCircuitBreakerService.isOpen() && reservedMemoryBytes + requiredBytes <= heapLimitBytes); } /** @@ -126,7 +102,7 @@ public synchronized boolean canAllocateReserved(long requiredBytes) { * true when circuit breaker is closed and there is enough overall memory. 
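 * For example (editor's illustration, not in the original javadoc): before loading a roughly 500 KB model, a caller could check {@code canAllocate(500_000)} and defer or reject the load when the circuit breaker is open or the heap limit would be crossed.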
*/ public synchronized boolean canAllocate(long bytes) { - return false == adCircuitBreakerService.isOpen() && totalMemoryBytes + bytes <= heapLimitBytes; + return false == timeSeriesCircuitBreakerService.isOpen() && totalMemoryBytes + bytes <= heapLimitBytes; } public synchronized void consumeMemory(long memoryToConsume, boolean reserved, Origin origin) { @@ -159,23 +135,6 @@ private void adjustOriginMemoryRelease(long memoryToConsume, Origin origin, Map< } } - /** - * Gets the estimated size of an entity's model. - * - * @param trcf ThresholdedRandomCutForest object - * @return estimated model size in bytes - */ - public long estimateTRCFModelSize(ThresholdedRandomCutForest trcf) { - RandomCutForest forest = trcf.getForest(); - return estimateTRCFModelSize( - forest.getDimensions(), - forest.getNumberOfTrees(), - forest.getBoundingBoxCacheFraction(), - forest.getShingleSize(), - forest.isInternalShinglingEnabled() - ); - } - /** * Gets the estimated size of an entity's model. * @@ -306,14 +265,6 @@ public long getHeapLimit() { return heapLimitBytes; } - /** - * - * @return Desired model partition size in bytes - */ - public long getDesiredModelSize() { - return desiredModelSize; - } - public long getTotalMemoryBytes() { return totalMemoryBytes; } @@ -360,4 +311,56 @@ public synchronized boolean syncMemoryState(Origin origin, long totalBytes, long public int getThresholdModelBytes() { return thresholdModelBytes; } + + /** + * Determines if hosting is allowed based on the estimated size of a given ThresholdedRandomCutForest and + * the available memory resources. + * + *
<p> This method synchronizes access to ensure that checks and operations related to resource availability + * are thread-safe. + * + * @param configId The identifier for the configuration being checked. Used in error messages. + * @param trcf The ThresholdedRandomCutForest to estimate the size for. + * @return True if the system can allocate the required bytes to host the trcf. + * @throws LimitExceededException If the required memory for the trcf exceeds the available memory. + * + * <p> Usage example: + * <pre> {@code + * boolean canHost = isHostingAllowed("config123", myTRCF); + * } </pre>
+ */ + public synchronized boolean isHostingAllowed(String configId, ThresholdedRandomCutForest trcf) { + long requiredBytes = estimateTRCFModelSize(trcf); + if (canAllocateReserved(requiredBytes)) { + return true; + } else { + throw new LimitExceededException( + configId, + String + .format( + Locale.ROOT, + "Exceeded memory limit. New size is %d bytes and max limit is %d bytes", + reservedMemoryBytes + requiredBytes, + heapLimitBytes + ) + ); + } + } + + /** + * Gets the estimated size of an entity's model. + * + * @param trcf ThresholdedRandomCutForest object + * @return estimated model size in bytes + */ + public long estimateTRCFModelSize(ThresholdedRandomCutForest trcf) { + RandomCutForest forest = trcf.getForest(); + return estimateTRCFModelSize( + forest.getDimensions(), + forest.getNumberOfTrees(), + forest.getBoundingBoxCacheFraction(), + forest.getShingleSize(), + forest.isInternalShinglingEnabled() + ); + } } diff --git a/src/main/java/org/opensearch/ad/Name.java b/src/main/java/org/opensearch/timeseries/Name.java similarity index 95% rename from src/main/java/org/opensearch/ad/Name.java rename to src/main/java/org/opensearch/timeseries/Name.java index 4b1915d7e..d53a2a33a 100644 --- a/src/main/java/org/opensearch/ad/Name.java +++ b/src/main/java/org/opensearch/timeseries/Name.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import java.util.Collection; import java.util.HashSet; diff --git a/src/main/java/org/opensearch/ad/NodeState.java b/src/main/java/org/opensearch/timeseries/NodeState.java similarity index 56% rename from src/main/java/org/opensearch/ad/NodeState.java rename to src/main/java/org/opensearch/timeseries/NodeState.java index c12a91deb..8537d0b64 100644 --- a/src/main/java/org/opensearch/ad/NodeState.java +++ b/src/main/java/org/opensearch/timeseries/NodeState.java @@ -9,198 +9,180 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import java.time.Clock; import java.time.Duration; import java.time.Instant; import java.util.Optional; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Job; /** * Storing intermediate state during the execution of transport action * */ public class NodeState implements ExpiringState { - private String detectorId; - // detector definition - private AnomalyDetector detectorDef; - // number of partitions - private int partitonNumber; + private String configId; + // config definition + private Config configDef; // last access time private Instant lastAccessTime; - // last detection error recorded in result index. Used by DetectorStateHandler - // to check if the error for a detector has changed or not. If changed, trigger indexing. - private Optional lastDetectionError; // last error. 
private Optional exception; - // flag indicating whether checkpoint for the detector exists - private boolean checkPointExists; // clock to get current time private final Clock clock; + // config job + private Job configJob; + + // AD only states + // number of partitions + private int partitonNumber; + + // flag indicating whether checkpoint for the detector exists + private boolean checkPointExists; + // cold start running flag to prevent concurrent cold start private boolean coldStartRunning; - // detector job - private AnomalyDetectorJob detectorJob; - public NodeState(String detectorId, Clock clock) { - this.detectorId = detectorId; - this.detectorDef = null; - this.partitonNumber = -1; + public NodeState(String configId, Clock clock) { + this.configId = configId; + this.configDef = null; this.lastAccessTime = clock.instant(); - this.lastDetectionError = Optional.empty(); this.exception = Optional.empty(); - this.checkPointExists = false; this.clock = clock; + this.partitonNumber = -1; + this.checkPointExists = false; this.coldStartRunning = false; - this.detectorJob = null; + this.configJob = null; } - public String getDetectorId() { - return detectorId; + public String getConfigId() { + return configId; } /** * * @return Detector configuration object */ - public AnomalyDetector getDetectorDef() { + public Config getConfigDef() { refreshLastUpdateTime(); - return detectorDef; + return configDef; } /** * - * @param detectorDef Detector configuration object + * @param configDef Analysis configuration object */ - public void setDetectorDef(AnomalyDetector detectorDef) { - this.detectorDef = detectorDef; + public void setConfigDef(Config configDef) { + this.configDef = configDef; refreshLastUpdateTime(); } /** * - * @return RCF partition number of the detector + * @return last exception if any */ - public int getPartitonNumber() { + public Optional getException() { refreshLastUpdateTime(); - return partitonNumber; + return exception; } /** * - * @param partitonNumber RCF partition number + * @param exception exception to record */ - public void setPartitonNumber(int partitonNumber) { - this.partitonNumber = partitonNumber; + public void setException(Exception exception) { + this.exception = Optional.ofNullable(exception); refreshLastUpdateTime(); } /** - * Used to indicate whether cold start succeeds or not - * @return whether checkpoint of models exists or not. + * refresh last access time. */ - public boolean doesCheckpointExists() { - refreshLastUpdateTime(); - return checkPointExists; + protected void refreshLastUpdateTime() { + lastAccessTime = clock.instant(); } /** - * - * @param checkpointExists mark whether checkpoint of models exists or not. 
+ * @param stateTtl time to leave for the state + * @return whether the transport state is expired */ - public void setCheckpointExists(boolean checkpointExists) { - refreshLastUpdateTime(); - this.checkPointExists = checkpointExists; - }; + @Override + public boolean expired(Duration stateTtl) { + return expired(lastAccessTime, stateTtl, clock.instant()); + } /** - * - * @return last model inference error - */ - public Optional getLastDetectionError() { + * + * @return RCF partition number of the detector + */ + public int getPartitonNumber() { refreshLastUpdateTime(); - return lastDetectionError; + return partitonNumber; } /** - * - * @param lastError last model inference error - */ - public void setLastDetectionError(String lastError) { - this.lastDetectionError = Optional.ofNullable(lastError); + * + * @param partitonNumber RCF partition number + */ + public void setPartitonNumber(int partitonNumber) { + this.partitonNumber = partitonNumber; refreshLastUpdateTime(); } /** - * - * @return last exception if any - */ - public Optional getException() { + * Used to indicate whether cold start succeeds or not + * @return whether checkpoint of models exists or not. + */ + public boolean doesCheckpointExists() { refreshLastUpdateTime(); - return exception; + return checkPointExists; } /** - * - * @param exception exception to record - */ - public void setException(Exception exception) { - this.exception = Optional.ofNullable(exception); + * + * @param checkpointExists mark whether checkpoint of models exists or not. + */ + public void setCheckpointExists(boolean checkpointExists) { refreshLastUpdateTime(); - } + this.checkPointExists = checkpointExists; + }; /** - * Used to prevent concurrent cold start - * @return whether cold start is running or not - */ + * Used to prevent concurrent cold start + * @return whether cold start is running or not + */ public boolean isColdStartRunning() { refreshLastUpdateTime(); return coldStartRunning; } /** - * - * @param coldStartRunning whether cold start is running or not - */ + * + * @param coldStartRunning whether cold start is running or not + */ public void setColdStartRunning(boolean coldStartRunning) { this.coldStartRunning = coldStartRunning; refreshLastUpdateTime(); } /** - * - * @return Detector configuration object - */ - public AnomalyDetectorJob getDetectorJob() { + * + * @return Job configuration object + */ + public Job getJob() { refreshLastUpdateTime(); - return detectorJob; + return configJob; } /** - * - * @param detectorJob Detector job - */ - public void setDetectorJob(AnomalyDetectorJob detectorJob) { - this.detectorJob = detectorJob; + * + * @param job analysis job + */ + public void setJob(Job job) { + this.configJob = job; refreshLastUpdateTime(); } - - /** - * refresh last access time. 
- */ - private void refreshLastUpdateTime() { - lastAccessTime = clock.instant(); - } - - /** - * @param stateTtl time to leave for the state - * @return whether the transport state is expired - */ - @Override - public boolean expired(Duration stateTtl) { - return expired(lastAccessTime, stateTtl, clock.instant()); - } } diff --git a/src/main/java/org/opensearch/ad/NodeStateManager.java b/src/main/java/org/opensearch/timeseries/NodeStateManager.java similarity index 56% rename from src/main/java/org/opensearch/ad/NodeStateManager.java rename to src/main/java/org/opensearch/timeseries/NodeStateManager.java index 47d09623a..799a1b6ca 100644 --- a/src/main/java/org/opensearch/ad/NodeStateManager.java +++ b/src/main/java/org/opensearch/timeseries/NodeStateManager.java @@ -1,20 +1,13 @@ /* + * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import java.io.IOException; import java.time.Clock; import java.time.Duration; import java.util.HashMap; @@ -22,49 +15,56 @@ import java.util.Map; import java.util.Optional; import java.util.concurrent.ConcurrentHashMap; +import java.util.function.Consumer; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.message.ParameterizedMessage; import org.apache.logging.log4j.util.Strings; +import org.opensearch.OpenSearchStatusException; import org.opensearch.action.get.GetRequest; import org.opensearch.action.get.GetResponse; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.ml.SingleStreamModelIdMapper; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.transport.BackPressureRouting; -import org.opensearch.ad.util.ClientUtil; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.lease.Releasable; +import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.common.unit.TimeValue; import org.opensearch.common.xcontent.LoggingDeprecationHandler; import org.opensearch.common.xcontent.XContentType; import org.opensearch.core.action.ActionListener; +import org.opensearch.core.rest.RestStatus; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.core.xcontent.XContentParser; - -/** - * NodeStateManager is used to manage states shared by transport and ml components - * like AnomalyDetector object - * - */ -public class NodeStateManager implements MaintenanceState, CleanState { +import org.opensearch.core.xcontent.XContentParserUtils; +import org.opensearch.forecast.model.Forecaster; +import org.opensearch.timeseries.common.exception.EndRunException; 
+import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.BiCheckedFunction; +import org.opensearch.timeseries.ml.SingleStreamModelIdMapper; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.transport.BackPressureRouting; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.RestHandlerUtils; + +public class NodeStateManager implements MaintenanceState, CleanState, ExceptionRecorder { private static final Logger LOG = LogManager.getLogger(NodeStateManager.class); + public static final String NO_ERROR = "no_error"; - private ConcurrentHashMap states; - private Client client; - private NamedXContentRegistry xContentRegistry; - private ClientUtil clientUtil; + + protected ConcurrentHashMap states; + protected Client client; + protected NamedXContentRegistry xContentRegistry; + protected ClientUtil clientUtil; + protected final Clock clock; + protected final Duration stateTtl; // map from detector id to the map of ES node id to the node's backpressureMuter private Map> backpressureMuter; - private final Clock clock; - private final Duration stateTtl; private int maxRetryForUnresponsiveNode; private TimeValue mutePeriod; @@ -86,17 +86,20 @@ public NodeStateManager( ClientUtil clientUtil, Clock clock, Duration stateTtl, - ClusterService clusterService + ClusterService clusterService, + Setting maxRetryForUnresponsiveNodeSetting, + Setting backoffMinutesSetting ) { this.states = new ConcurrentHashMap<>(); this.client = client; this.xContentRegistry = xContentRegistry; this.clientUtil = clientUtil; - this.backpressureMuter = new ConcurrentHashMap<>(); this.clock = clock; this.stateTtl = stateTtl; - this.maxRetryForUnresponsiveNode = MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_RETRY_FOR_UNRESPONSIVE_NODE, it -> { + this.backpressureMuter = new ConcurrentHashMap<>(); + + this.maxRetryForUnresponsiveNode = maxRetryForUnresponsiveNodeSetting.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(maxRetryForUnresponsiveNodeSetting, it -> { this.maxRetryForUnresponsiveNode = it; Iterator> iter = backpressureMuter.values().iterator(); while (iter.hasNext()) { @@ -104,8 +107,8 @@ public NodeStateManager( entry.values().forEach(v -> v.setMaxRetryForUnresponsiveNode(it)); } }); - this.mutePeriod = BACKOFF_MINUTES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(BACKOFF_MINUTES, it -> { + this.mutePeriod = backoffMinutesSetting.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(backoffMinutesSetting, it -> { this.mutePeriod = it; Iterator> iter = backpressureMuter.values().iterator(); while (iter.hasNext()) { @@ -113,120 +116,37 @@ public NodeStateManager( entry.values().forEach(v -> v.setMutePeriod(it)); } }); - } - - /** - * Get Detector config object if present - * @param adID detector Id - * @return the Detecor config object or empty Optional - */ - public Optional getAnomalyDetectorIfPresent(String adID) { - NodeState state = states.get(adID); - return Optional.ofNullable(state).map(NodeState::getDetectorDef); - } - public void getAnomalyDetector(String adID, ActionListener> listener) { - NodeState state = states.get(adID); - if (state != null && state.getDetectorDef() != null) { - 
listener.onResponse(Optional.of(state.getDetectorDef())); - } else { - GetRequest request = new GetRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX, adID); - clientUtil.asyncRequest(request, client::get, onGetDetectorResponse(adID, listener)); - } - } - - private ActionListener onGetDetectorResponse(String adID, ActionListener> listener) { - return ActionListener.wrap(response -> { - if (response == null || !response.isExists()) { - listener.onResponse(Optional.empty()); - return; - } - - String xc = response.getSourceAsString(); - LOG.debug("Fetched anomaly detector: {}", xc); - - try ( - XContentParser parser = XContentType.JSON.xContent().createParser(xContentRegistry, LoggingDeprecationHandler.INSTANCE, xc) - ) { - ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetector detector = AnomalyDetector.parse(parser, response.getId()); - // end execution if all features are disabled - if (detector.getEnabledFeatureIds().isEmpty()) { - listener - .onFailure( - new EndRunException(adID, CommonErrorMessages.ALL_FEATURES_DISABLED_ERR_MSG, true).countedInStats(false) - ); - return; - } - NodeState state = states.computeIfAbsent(adID, id -> new NodeState(id, clock)); - state.setDetectorDef(detector); - - listener.onResponse(Optional.of(detector)); - } catch (Exception t) { - LOG.error("Fail to parse detector {}", adID); - LOG.error("Stack trace:", t); - listener.onResponse(Optional.empty()); - } - }, listener::onFailure); } /** - * Get a detector's checkpoint and save a flag if we find any so that next time we don't need to do it again - * @param adID the detector's ID - * @param listener listener to handle get request + * Clean states if it is older than our stateTtl. transportState has to be a + * ConcurrentHashMap otherwise we will have + * java.util.ConcurrentModificationException. + * */ - public void getDetectorCheckpoint(String adID, ActionListener listener) { - NodeState state = states.get(adID); - if (state != null && state.doesCheckpointExists()) { - listener.onResponse(Boolean.TRUE); - return; - } - - GetRequest request = new GetRequest(CommonName.CHECKPOINT_INDEX_NAME, SingleStreamModelIdMapper.getRcfModelId(adID, 0)); - - clientUtil.asyncRequest(request, client::get, onGetCheckpointResponse(adID, listener)); - } - - private ActionListener onGetCheckpointResponse(String adID, ActionListener listener) { - return ActionListener.wrap(response -> { - if (response == null || !response.isExists()) { - listener.onResponse(Boolean.FALSE); - } else { - NodeState state = states.computeIfAbsent(adID, id -> new NodeState(id, clock)); - state.setCheckpointExists(true); - listener.onResponse(Boolean.TRUE); - } - }, listener::onFailure); + @Override + public void maintenance() { + maintenance(states, stateTtl); } /** * Used in delete workflow * - * @param detectorId detector ID + * @param configId config ID */ @Override - public void clear(String detectorId) { - Map routingMap = backpressureMuter.get(detectorId); + public void clear(String configId) { + Map routingMap = backpressureMuter.get(configId); if (routingMap != null) { routingMap.clear(); - backpressureMuter.remove(detectorId); + backpressureMuter.remove(configId); } - states.remove(detectorId); + states.remove(configId); } - /** - * Clean states if it is older than our stateTtl. transportState has to be a - * ConcurrentHashMap otherwise we will have - * java.util.ConcurrentModificationException. 
- * - */ - @Override - public void maintenance() { - maintenance(states, stateTtl); - } - - public boolean isMuted(String nodeId, String detectorId) { - Map routingMap = backpressureMuter.get(detectorId); + public boolean isMuted(String nodeId, String configId) { + Map routingMap = backpressureMuter.get(configId); if (routingMap == null || routingMap.isEmpty()) { return false; } @@ -237,68 +157,140 @@ public boolean isMuted(String nodeId, String detectorId) { /** * When we have a unsuccessful call with a node, increment the backpressure counter. * @param nodeId an ES node's ID - * @param detectorId Detector ID + * @param configId config ID */ - public void addPressure(String nodeId, String detectorId) { + public void addPressure(String nodeId, String configId) { Map routingMap = backpressureMuter - .computeIfAbsent(detectorId, k -> new HashMap()); + .computeIfAbsent(configId, k -> new HashMap()); routingMap.computeIfAbsent(nodeId, k -> new BackPressureRouting(k, clock, maxRetryForUnresponsiveNode, mutePeriod)).addPressure(); } /** * When we have a successful call with a node, clear the backpressure counter. * @param nodeId an ES node's ID - * @param detectorId Detector ID + * @param configId config ID */ - public void resetBackpressureCounter(String nodeId, String detectorId) { - Map routingMap = backpressureMuter.get(detectorId); + public void resetBackpressureCounter(String nodeId, String configId) { + Map routingMap = backpressureMuter.get(configId); if (routingMap == null || routingMap.isEmpty()) { - backpressureMuter.remove(detectorId); + backpressureMuter.remove(configId); return; } routingMap.remove(nodeId); } /** - * Check if there is running query on given detector - * @param detector Anomaly Detector - * @return true if given detector has a running query else false + * Get config and execute consumer function. + * [Important!] Make sure listener returns in function + * + * @param configId config id + * @param analysisType analysis type + * @param function consumer function. + * @param listener action listener. Only meant to return failure. 
+     * @param <T> action listener response type
+     */
-    public boolean hasRunningQuery(AnomalyDetector detector) {
-        return clientUtil.hasRunningQuery(detector);
+    public <T> void getConfig(
+        String configId,
+        AnalysisType analysisType,
+        Consumer<Optional<? extends Config>> function,
+        ActionListener<T> listener
+    ) {
+        GetRequest getRequest = new GetRequest(CommonName.CONFIG_INDEX, configId);
+        client.get(getRequest, ActionListener.wrap(response -> {
+            if (!response.isExists()) {
+                function.accept(Optional.empty());
+                return;
+            }
+            try (
+                XContentParser parser = RestHandlerUtils.createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef())
+            ) {
+                XContentParserUtils.ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
+                Config config = null;
+                if (analysisType == AnalysisType.AD) {
+                    config = AnomalyDetector.parse(parser, response.getId(), response.getVersion());
+                } else if (analysisType == AnalysisType.FORECAST) {
+                    config = Forecaster.parse(parser, response.getId(), response.getVersion());
+                } else {
+                    throw new UnsupportedOperationException("This method is not supported");
+                }
+
+                function.accept(Optional.of(config));
+            } catch (Exception e) {
+                String message = "Failed to parse config " + configId;
+                LOG.error(message, e);
+                listener.onFailure(new OpenSearchStatusException(message, RestStatus.INTERNAL_SERVER_ERROR));
+            }
+        }, exception -> {
+            LOG.error("Failed to get config " + configId, exception);
+            listener.onFailure(exception);
+        }));
     }

-    /**
-     * Get last error of a detector
-     * @param adID detector id
-     * @return last error for the detector
-     */
-    public String getLastDetectionError(String adID) {
-        return Optional.ofNullable(states.get(adID)).flatMap(state -> state.getLastDetectionError()).orElse(NO_ERROR);
+    public void getConfig(String configID, AnalysisType context, ActionListener<Optional<? extends Config>> listener) {
+        NodeState state = states.get(configID);
+        if (state != null && state.getConfigDef() != null) {
+            listener.onResponse(Optional.of(state.getConfigDef()));
+        } else {
+            GetRequest request = new GetRequest(CommonName.CONFIG_INDEX, configID);
+            BiCheckedFunction<XContentParser, String, ? extends Config, IOException> configParser = context == AnalysisType.AD
+                ? AnomalyDetector::parse
+                : Forecaster::parse;
+            clientUtil.asyncRequest(request, client::get, onGetConfigResponse(configID, configParser, listener));
+        }
     }

-    /**
-     * Set last detection error of a detector
-     * @param adID detector id
-     * @param error error, can be null
-     */
-    public void setLastDetectionError(String adID, String error) {
-        NodeState state = states.computeIfAbsent(adID, id -> new NodeState(id, clock));
-        state.setLastDetectionError(error);
+    private ActionListener<GetResponse> onGetConfigResponse(
+        String configID,
+        BiCheckedFunction<XContentParser, String, ? extends Config, IOException> configParser,
+        ActionListener<Optional<? extends Config>> listener
+    ) {
+        return ActionListener.wrap(response -> {
+            if (response == null || !response.isExists()) {
+                listener.onResponse(Optional.empty());
+                return;
+            }
+
+            String xc = response.getSourceAsString();
+            LOG.debug("Fetched config: {}", xc);
+
+            try (
+                XContentParser parser = XContentType.JSON.xContent().createParser(xContentRegistry, LoggingDeprecationHandler.INSTANCE, xc)
+            ) {
+                ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
+                Config config = configParser.apply(parser, response.getId());
+
+                // end execution if all features are disabled
+                if (config.getEnabledFeatureIds().isEmpty()) {
+                    listener
+                        .onFailure(new EndRunException(configID, CommonMessages.ALL_FEATURES_DISABLED_ERR_MSG, true).countedInStats(false));
+                    return;
+                }
+
+                NodeState state = states.computeIfAbsent(configID, configId -> new NodeState(configId, clock));
+                state.setConfigDef(config);
+
+                listener.onResponse(Optional.of(config));
+            } catch (Exception t) {
+                LOG.error("Fail to parse config {}", configID);
+                LOG.error("Stack trace:", t);
+                listener.onResponse(Optional.empty());
+            }
+        }, listener::onFailure);
     }

     /**
-     * Get a detector's exception. The method has side effect.
+     * Get the exception of an analysis. The method has a side effect.
      * We reset error after calling the method because
-     * 1) We record a detector's exception in each interval. There is no need
-     * to record it twice.
+     * 1) We record the exception of an analysis in each interval.
+     * There is no need to record it twice.
      * 2) EndRunExceptions can stop job running. We only want to send the same
      * signal once for each exception.
-     * @param adID detector id
-     * @return the detector's exception
+     * @param configID config id
+     * @return the config's exception
      */
-    public Optional<Exception> fetchExceptionAndClear(String adID) {
-        NodeState state = states.get(adID);
+    @Override
+    public Optional<Exception> fetchExceptionAndClear(String configID) {
+        NodeState state = states.get(configID);
         if (state == null) {
             return Optional.empty();
         }
@@ -309,26 +301,27 @@ public Optional<Exception> fetchExceptionAndClear(String adID) {
     }

     /**
-     * For single-stream detector, we have one exception per interval. When
+     * For single-stream analysis, we have one exception per interval. When
      * an interval starts, it fetches and clears the exception.
-     * For HCAD, there can be one exception per entity. To not bloat memory
+     * For HC analysis, there can be one exception per entity. To not bloat memory
      * with exceptions, we will keep only one exception. An exception has 3 purposes:
-     * 1) stop detector if nothing else works;
+     * 1) stop analysis if nothing else works;
      * 2) increment error stats to ticket about high-error domain
      * 3) debugging.
      *
-     * For HCAD, we record all entities' exceptions in anomaly results. So 3)
+     * For HC analysis, we record all entities' exceptions in the result index. So 3)
      * is covered. As long as we keep one exception among all exceptions, 2)
      * is covered. So the only thing we have to pay attention to is to keep EndRunException.
      * When overriding an exception, EndRunException has priority.
-     * @param detectorId Detector Id
+     * @param configId Config Id
      * @param e Exception to set
      */
-    public void setException(String detectorId, Exception e) {
-        if (e == null || Strings.isEmpty(detectorId)) {
+    @Override
+    public void setException(String configId, Exception e) {
+        if (e == null || Strings.isEmpty(configId)) {
             return;
         }
-        NodeState state = states.computeIfAbsent(detectorId, d -> new NodeState(detectorId, clock));
+        NodeState state = states.computeIfAbsent(configId, d -> new NodeState(configId, clock));
         Optional<Exception> exception = state.getException();
         if (exception.isPresent()) {
             Exception higherPriorityException = ExceptionUtil.selectHigherPriorityException(e, exception.get());
@@ -340,6 +333,35 @@ public void setException(String detectorId, Exception e) {
         state.setException(e);
     }

+    /**
+     * Get a detector's checkpoint and, if one exists, save a flag so the lookup
+     * is skipped next time.
+     * @param adID the detector's ID
+     * @param listener listener to handle the get request
+     */
+    public void getDetectorCheckpoint(String adID, ActionListener<Boolean> listener) {
+        NodeState state = states.get(adID);
+        if (state != null && state.doesCheckpointExists()) {
+            listener.onResponse(Boolean.TRUE);
+            return;
+        }
+
+        GetRequest request = new GetRequest(ADCommonName.CHECKPOINT_INDEX_NAME, SingleStreamModelIdMapper.getRcfModelId(adID, 0));
+
+        clientUtil.asyncRequest(request, client::get, onGetCheckpointResponse(adID, listener));
+    }
+
+    private ActionListener<GetResponse> onGetCheckpointResponse(String adID, ActionListener<Boolean> listener) {
+        return ActionListener.wrap(response -> {
+            if (response == null || !response.isExists()) {
+                listener.onResponse(Boolean.FALSE);
+            } else {
+                NodeState state = states.computeIfAbsent(adID, id -> new NodeState(id, clock));
+                state.setCheckpointExists(true);
+                listener.onResponse(Boolean.TRUE);
+            }
+        }, listener::onFailure);
+    }
+
     /**
      * Whether last cold start for the detector is running
      * @param adID detector ID
@@ -370,17 +392,17 @@ public Releasable markColdStartRunning(String adID) {
         };
     }

-    public void getAnomalyDetectorJob(String adID, ActionListener<Optional<AnomalyDetectorJob>> listener) {
-        NodeState state = states.get(adID);
-        if (state != null && state.getDetectorJob() != null) {
-            listener.onResponse(Optional.of(state.getDetectorJob()));
+    public void getJob(String configID, ActionListener<Optional<Job>> listener) {
+        NodeState state = states.get(configID);
+        if (state != null && state.getJob() != null) {
+            listener.onResponse(Optional.of(state.getJob()));
         } else {
-            GetRequest request = new GetRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, adID);
-            clientUtil.asyncRequest(request, client::get, onGetDetectorJobResponse(adID, listener));
+            GetRequest request = new GetRequest(CommonName.JOB_INDEX, configID);
+            clientUtil.asyncRequest(request, client::get, onGetJobResponse(configID, listener));
         }
     }

-    private ActionListener<GetResponse> onGetDetectorJobResponse(String adID, ActionListener<Optional<AnomalyDetectorJob>> listener) {
+    private ActionListener<GetResponse> onGetJobResponse(String configID, ActionListener<Optional<Job>> listener) {
         return ActionListener.wrap(response -> {
             if (response == null || !response.isExists()) {
                 listener.onResponse(Optional.empty());
@@ -388,7 +410,7 @@ private ActionListener<GetResponse> onGetDetectorJobResponse(String adID, Action
             }

             String xc = response.getSourceAsString();
-            LOG.debug("Fetched anomaly detector: {}", xc);
+            LOG.debug("Fetched config: {}", xc);

             try (
                 XContentParser parser = XContentType.JSON
@@ -396,13
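// Illustration, not part of this commit: a typical caller of the renamed
// NodeStateManager#getJob above. The names stateManager and configId are
// assumed for the sketch; the listener and Optional shapes come from the
// signatures in this diff. Repeat calls are cheap because the parsed Job is
// cached in NodeState, so only the first call hits CommonName.JOB_INDEX.
stateManager.getJob(configId, ActionListener.wrap(jobOptional -> {
    if (jobOptional.isPresent() && jobOptional.get().isEnabled()) {
        // proceed with this interval's run using the cached Job
    }
}, exception -> LOG.error("Failed to load job for config " + configId, exception)));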
+418,13 @@ private ActionListener onGetDetectorJobResponse(String adID, Action .createParser(xContentRegistry, LoggingDeprecationHandler.INSTANCE, response.getSourceAsString()) ) { ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - AnomalyDetectorJob job = AnomalyDetectorJob.parse(parser); - NodeState state = states.computeIfAbsent(adID, id -> new NodeState(id, clock)); - state.setDetectorJob(job); + Job job = Job.parse(parser); + NodeState state = states.computeIfAbsent(configID, id -> new NodeState(id, clock)); + state.setJob(job); listener.onResponse(Optional.of(job)); } catch (Exception t) { - LOG.error(new ParameterizedMessage("Fail to parse job {}", adID), t); + LOG.error(new ParameterizedMessage("Fail to parse job {}", configID), t); listener.onResponse(Optional.empty()); } }, listener::onFailure); diff --git a/src/main/java/org/opensearch/ad/AnomalyDetectorPlugin.java b/src/main/java/org/opensearch/timeseries/TimeSeriesAnalyticsPlugin.java similarity index 75% rename from src/main/java/org/opensearch/ad/AnomalyDetectorPlugin.java rename to src/main/java/org/opensearch/timeseries/TimeSeriesAnalyticsPlugin.java index 6c6dd2059..7dadac650 100644 --- a/src/main/java/org/opensearch/ad/AnomalyDetectorPlugin.java +++ b/src/main/java/org/opensearch/timeseries/TimeSeriesAnalyticsPlugin.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import static java.util.Collections.unmodifiableList; @@ -33,7 +33,9 @@ import org.apache.logging.log4j.Logger; import org.opensearch.SpecialPermission; import org.opensearch.action.ActionRequest; -import org.opensearch.ad.breaker.ADCircuitBreakerService; +import org.opensearch.ad.AnomalyDetectorJobRunner; +import org.opensearch.ad.AnomalyDetectorRunner; +import org.opensearch.ad.ExecuteADResultResponseRecorder; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; import org.opensearch.ad.caching.PriorityCache; @@ -41,20 +43,14 @@ import org.opensearch.ad.cluster.ADDataMigrator; import org.opensearch.ad.cluster.ClusterManagerEventListener; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.dataprocessor.IntegerSensitiveSingleFeatureLinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.Interpolator; -import org.opensearch.ad.dataprocessor.LinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.feature.FeatureManager; -import org.opensearch.ad.feature.SearchFeatureDao; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityColdStarter; import org.opensearch.ad.ml.HybridThresholdingModel; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.model.DetectorInternalState; import org.opensearch.ad.ratelimit.CheckPointMaintainRequestAdapter; @@ -78,13 +74,12 @@ import org.opensearch.ad.rest.RestSearchTopAnomalyResultAction; import org.opensearch.ad.rest.RestStatsAnomalyDetectorAction; import org.opensearch.ad.rest.RestValidateAnomalyDetectorAction; +import org.opensearch.ad.settings.ADEnabledSetting; +import 
org.opensearch.ad.settings.ADNumericSetting; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.EnabledSetting; import org.opensearch.ad.settings.LegacyOpenDistroAnomalyDetectorSettings; -import org.opensearch.ad.settings.NumericSetting; import org.opensearch.ad.stats.ADStat; import org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; import org.opensearch.ad.stats.suppliers.CounterSupplier; import org.opensearch.ad.stats.suppliers.IndexStatusSupplier; import org.opensearch.ad.stats.suppliers.ModelsOnNodeCountSupplier; @@ -157,11 +152,7 @@ import org.opensearch.ad.transport.handler.AnomalyIndexHandler; import org.opensearch.ad.transport.handler.AnomalyResultBulkIndexHandler; import org.opensearch.ad.transport.handler.MultiEntityResultHandler; -import org.opensearch.ad.util.ClientUtil; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.ad.util.IndexUtils; -import org.opensearch.ad.util.SecurityClientUtil; -import org.opensearch.ad.util.Throttler; import org.opensearch.client.Client; import org.opensearch.cluster.metadata.IndexNameExpressionResolver; import org.opensearch.cluster.node.DiscoveryNodes; @@ -180,6 +171,8 @@ import org.opensearch.core.xcontent.XContentParserUtils; import org.opensearch.env.Environment; import org.opensearch.env.NodeEnvironment; +import org.opensearch.forecast.model.Forecaster; +import org.opensearch.forecast.settings.ForecastSettings; import org.opensearch.jobscheduler.spi.JobSchedulerExtension; import org.opensearch.jobscheduler.spi.ScheduledJobParser; import org.opensearch.jobscheduler.spi.ScheduledJobRunner; @@ -195,6 +188,18 @@ import org.opensearch.threadpool.ExecutorBuilder; import org.opensearch.threadpool.ScalingExecutorBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.function.ThrowingSupplierWrapper; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.watcher.ResourceWatcherService; import com.amazon.randomcutforest.parkservices.state.ThresholdedRandomCutForestMapper; @@ -213,10 +218,11 @@ /** * Entry point of AD plugin. 
*/ -public class AnomalyDetectorPlugin extends Plugin implements ActionPlugin, ScriptPlugin, JobSchedulerExtension { +public class TimeSeriesAnalyticsPlugin extends Plugin implements ActionPlugin, ScriptPlugin, JobSchedulerExtension { - private static final Logger LOG = LogManager.getLogger(AnomalyDetectorPlugin.class); + private static final Logger LOG = LogManager.getLogger(TimeSeriesAnalyticsPlugin.class); + // AD constants public static final String LEGACY_AD_BASE = "/_opendistro/_anomaly_detection"; public static final String LEGACY_OPENDISTRO_AD_BASE_URI = LEGACY_AD_BASE + "/detectors"; public static final String AD_BASE_URI = "/_plugins/_anomaly_detection"; @@ -224,9 +230,18 @@ public class AnomalyDetectorPlugin extends Plugin implements ActionPlugin, Scrip public static final String AD_THREAD_POOL_PREFIX = "opensearch.ad."; public static final String AD_THREAD_POOL_NAME = "ad-threadpool"; public static final String AD_BATCH_TASK_THREAD_POOL_NAME = "ad-batch-task-threadpool"; - public static final String AD_JOB_TYPE = "opendistro_anomaly_detector"; + + // forecasting constants + public static final String FORECAST_BASE_URI = "/_plugins/_forecast"; + public static final String FORECAST_FORECASTERS_URI = FORECAST_BASE_URI + "/forecasters"; + public static final String FORECAST_THREAD_POOL_PREFIX = "opensearch.forecast."; + public static final String FORECAST_THREAD_POOL_NAME = "forecast-threadpool"; + public static final String FORECAST_BATCH_TASK_THREAD_POOL_NAME = "forecast-batch-task-threadpool"; + + public static final String TIME_SERIES_JOB_TYPE = "opensearch_time_series_analytics"; + private static Gson gson; - private AnomalyDetectionIndices anomalyDetectionIndices; + private ADIndexManagement anomalyDetectionIndices; private AnomalyDetectorRunner anomalyDetectorRunner; private Client client; private ClusterService clusterService; @@ -247,10 +262,10 @@ public class AnomalyDetectorPlugin extends Plugin implements ActionPlugin, Scrip SpecialPermission.check(); // gson intialization requires "java.lang.RuntimePermission" "accessDeclaredMembers" to // initialize ConstructorConstructor - AccessController.doPrivileged((PrivilegedAction) AnomalyDetectorPlugin::initGson); + AccessController.doPrivileged((PrivilegedAction) TimeSeriesAnalyticsPlugin::initGson); } - public AnomalyDetectorPlugin() {} + public TimeSeriesAnalyticsPlugin() {} @Override public List getRestHandlers( @@ -324,46 +339,50 @@ public Collection createComponents( IndexNameExpressionResolver indexNameExpressionResolver, Supplier repositoriesServiceSupplier ) { - EnabledSetting.getInstance().init(clusterService); - NumericSetting.getInstance().init(clusterService); + ADEnabledSetting.getInstance().init(clusterService); + ADNumericSetting.getInstance().init(clusterService); this.client = client; this.threadPool = threadPool; Settings settings = environment.settings(); - Throttler throttler = new Throttler(getClock()); - this.clientUtil = new ClientUtil(settings, client, throttler, threadPool); + this.clientUtil = new ClientUtil(client); this.indexUtils = new IndexUtils(client, clientUtil, clusterService, indexNameExpressionResolver); this.nodeFilter = new DiscoveryNodeFilterer(clusterService); - this.anomalyDetectionIndices = new AnomalyDetectionIndices( - client, - clusterService, - threadPool, - settings, - nodeFilter, - AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES - ); + // convert from checked IOException to unchecked RuntimeException + this.anomalyDetectionIndices = ThrowingSupplierWrapper + 
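// Illustration, not part of this commit: ThrowingSupplierWrapper, used in the
// surrounding call, is in essence an adapter from a supplier that throws a
// checked exception (here, ADIndexManagement's IOException) to a plain
// java.util.function.Supplier. A minimal sketch of that idea; the interface
// and method names below are assumed from this diff:
import java.util.function.Supplier;

final class ThrowingSupplierWrapperSketch {
    @FunctionalInterface
    interface ThrowingSupplier<T, E extends Exception> {
        T get() throws E;
    }

    // Rethrow checked exceptions unchecked so call sites stay one-liners.
    static <T> Supplier<T> throwingSupplierWrapper(ThrowingSupplier<T, Exception> delegate) {
        return () -> {
            try {
                return delegate.get();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        };
    }
}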
.throwingSupplierWrapper( + () -> new ADIndexManagement( + client, + clusterService, + threadPool, + settings, + nodeFilter, + TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES + ) + ) + .get(); this.clusterService = clusterService; - SingleFeatureLinearUniformInterpolator singleFeatureLinearUniformInterpolator = - new IntegerSensitiveSingleFeatureLinearUniformInterpolator(); - Interpolator interpolator = new LinearUniformInterpolator(singleFeatureLinearUniformInterpolator); + Imputer imputer = new LinearUniformImputer(true); stateManager = new NodeStateManager( client, xContentRegistry, settings, clientUtil, getClock(), - AnomalyDetectorSettings.HOURLY_MAINTENANCE, - clusterService + TimeSeriesSettings.HOURLY_MAINTENANCE, + clusterService, + TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, + TimeSeriesSettings.BACKOFF_MINUTES ); securityClientUtil = new SecurityClientUtil(stateManager, settings); SearchFeatureDao searchFeatureDao = new SearchFeatureDao( client, xContentRegistry, - interpolator, + imputer, securityClientUtil, settings, clusterService, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE + TimeSeriesSettings.NUM_SAMPLES_PER_TREE ); JvmService jvmService = new JvmService(environment.settings()); @@ -373,21 +392,15 @@ public Collection createComponents( mapper.setPartialTreeStateEnabled(true); V1JsonToV3StateConverter converter = new V1JsonToV3StateConverter(); - double modelMaxSizePercent = AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(settings); + double modelMaxSizePercent = AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.get(settings); - ADCircuitBreakerService adCircuitBreakerService = new ADCircuitBreakerService(jvmService).init(); + CircuitBreakerService adCircuitBreakerService = new CircuitBreakerService(jvmService).init(); - MemoryTracker memoryTracker = new MemoryTracker( - jvmService, - modelMaxSizePercent, - AnomalyDetectorSettings.DESIRED_MODEL_SIZE_PERCENTAGE, - clusterService, - adCircuitBreakerService - ); + MemoryTracker memoryTracker = new MemoryTracker(jvmService, modelMaxSizePercent, clusterService, adCircuitBreakerService); FeatureManager featureManager = new FeatureManager( searchFeatureDao, - interpolator, + imputer, getClock(), AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, @@ -397,7 +410,7 @@ public Collection createComponents( AnomalyDetectorSettings.MAX_IMPUTATION_NEIGHBOR_DISTANCE, AnomalyDetectorSettings.PREVIEW_SAMPLE_RATE, AnomalyDetectorSettings.MAX_PREVIEW_SAMPLES, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, threadPool, AD_THREAD_POOL_NAME ); @@ -410,7 +423,7 @@ public GenericObjectPool run() { return new GenericObjectPool<>(new BasePooledObjectFactory() { @Override public LinkedBuffer create() throws Exception { - return LinkedBuffer.allocate(AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES); + return LinkedBuffer.allocate(TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES); } @Override @@ -420,16 +433,16 @@ public PooledObject wrap(LinkedBuffer obj) { }); } }); - serializeRCFBufferPool.setMaxTotal(AnomalyDetectorSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS); - serializeRCFBufferPool.setMaxIdle(AnomalyDetectorSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS); + serializeRCFBufferPool.setMaxTotal(TimeSeriesSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS); + serializeRCFBufferPool.setMaxIdle(TimeSeriesSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS); serializeRCFBufferPool.setMinIdle(0); serializeRCFBufferPool.setBlockWhenExhausted(false); - 
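// Illustration, not part of this commit: the pool configured here follows
// Commons Pool 2's standard borrow/return protocol. A usage sketch, assuming
// the same GenericObjectPool<LinkedBuffer> (generic parameters restored):
static void withSerializationBuffer(GenericObjectPool<LinkedBuffer> pool) throws Exception {
    LinkedBuffer buffer = pool.borrowObject();  // fails fast: blockWhenExhausted is false
    try {
        // ... write the serialized RCF checkpoint through the buffer ...
    } finally {
        buffer.clear();                // reset the write position before reuse
        pool.returnObject(buffer);     // hand it back so maxTotal is respected
    }
}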
serializeRCFBufferPool.setTimeBetweenEvictionRuns(AnomalyDetectorSettings.HOURLY_MAINTENANCE); + serializeRCFBufferPool.setTimeBetweenEvictionRuns(TimeSeriesSettings.HOURLY_MAINTENANCE); CheckpointDao checkpoint = new CheckpointDao( client, clientUtil, - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, gson, mapper, converter, @@ -441,10 +454,10 @@ public PooledObject wrap(LinkedBuffer obj) { ), HybridThresholdingModel.class, anomalyDetectionIndices, - AnomalyDetectorSettings.MAX_CHECKPOINT_BYTES, + TimeSeriesSettings.MAX_CHECKPOINT_BYTES, serializeRCFBufferPool, - AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES, - 1 - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE + TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES, + 1 - TimeSeriesSettings.THRESHOLD_MIN_PVALUE ); Random random = new Random(42); @@ -454,8 +467,8 @@ public PooledObject wrap(LinkedBuffer obj) { CheckPointMaintainRequestAdapter adapter = new CheckPointMaintainRequestAdapter( cacheProvider, checkpoint, - CommonName.CHECKPOINT_INDEX_NAME, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + ADCommonName.CHECKPOINT_INDEX_NAME, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, getClock(), clusterService, settings @@ -463,62 +476,62 @@ public PooledObject wrap(LinkedBuffer obj) { CheckpointWriteWorker checkpointWriteQueue = new CheckpointWriteWorker( heapSizeBytes, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, checkpoint, - CommonName.CHECKPOINT_INDEX_NAME, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + ADCommonName.CHECKPOINT_INDEX_NAME, + TimeSeriesSettings.HOURLY_MAINTENANCE, stateManager, - AnomalyDetectorSettings.HOURLY_MAINTENANCE + TimeSeriesSettings.HOURLY_MAINTENANCE ); CheckpointMaintainWorker checkpointMaintainQueue = new CheckpointMaintainWorker( heapSizeBytes, - AnomalyDetectorSettings.CHECKPOINT_MAINTAIN_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.CHECKPOINT_MAINTAIN_REQUEST_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, checkpointWriteQueue, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, stateManager, adapter ); EntityCache cache = new PriorityCache( checkpoint, - 
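// Illustration, not part of this commit: the renames in this hunk follow one
// rule: values shared by AD and forecasting move to TimeSeriesSettings, while
// AD-only cluster settings gain an AD_ prefix in AnomalyDetectorSettings. A
// dynamic time setting of the kind referenced here would be declared roughly
// as below; the key string and default are assumptions for the sketch:
public static final Setting<TimeValue> AD_CHECKPOINT_SAVING_FREQ = Setting
    .positiveTimeSetting(
        "plugins.anomaly_detection.checkpoint_saving_freq",  // assumed key
        TimeValue.timeValueHours(6),                         // assumed default
        Setting.Property.NodeScope,
        Setting.Property.Dynamic                             // updatable via the cluster settings API
    );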
AnomalyDetectorSettings.DEDICATED_CACHE_SIZE.get(settings), - AnomalyDetectorSettings.CHECKPOINT_TTL, + AnomalyDetectorSettings.AD_DEDICATED_CACHE_SIZE.get(settings), + AnomalyDetectorSettings.AD_CHECKPOINT_TTL, AnomalyDetectorSettings.MAX_INACTIVE_ENTITIES, memoryTracker, - AnomalyDetectorSettings.NUM_TREES, + TimeSeriesSettings.NUM_TREES, getClock(), clusterService, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, threadPool, checkpointWriteQueue, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, checkpointMaintainQueue, settings, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ ); cacheProvider.set(cache); @@ -527,39 +540,39 @@ public PooledObject wrap(LinkedBuffer obj) { getClock(), threadPool, stateManager, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.TIME_DECAY, - AnomalyDetectorSettings.NUM_MIN_SAMPLES, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.TIME_DECAY, + TimeSeriesSettings.NUM_MIN_SAMPLES, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, - interpolator, + imputer, searchFeatureDao, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, featureManager, settings, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, - AnomalyDetectorSettings.MAX_COLD_START_ROUNDS + TimeSeriesSettings.MAX_COLD_START_ROUNDS ); EntityColdStartWorker coldstartQueue = new EntityColdStartWorker( heapSizeBytes, AnomalyDetectorSettings.ENTITY_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, entityColdStarter, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, stateManager, cacheProvider ); @@ -567,14 +580,14 @@ public PooledObject wrap(LinkedBuffer obj) { ModelManager modelManager = new ModelManager( checkpoint, getClock(), - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.TIME_DECAY, - AnomalyDetectorSettings.NUM_MIN_SAMPLES, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.TIME_DECAY, + TimeSeriesSettings.NUM_MIN_SAMPLES, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, AnomalyDetectorSettings.MIN_PREVIEW_SIZE, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + TimeSeriesSettings.HOURLY_MAINTENANCE, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, featureManager, memoryTracker, @@ -594,23 +607,23 @@ public PooledObject wrap(LinkedBuffer obj) { ResultWriteWorker resultWriteQueue = new ResultWriteWorker( 
heapSizeBytes, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_SIZE_IN_BYTES, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.RESULT_WRITE_QUEUE_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, multiEntityResultHandler, xContentRegistry, stateManager, - AnomalyDetectorSettings.HOURLY_MAINTENANCE + TimeSeriesSettings.HOURLY_MAINTENANCE ); Map> stats = ImmutableMap @@ -625,23 +638,23 @@ public PooledObject wrap(LinkedBuffer obj) { ) .put( StatNames.ANOMALY_DETECTORS_INDEX_STATUS.getName(), - new ADStat<>(true, new IndexStatusSupplier(indexUtils, AnomalyDetector.ANOMALY_DETECTORS_INDEX)) + new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.CONFIG_INDEX)) ) .put( StatNames.ANOMALY_RESULTS_INDEX_STATUS.getName(), - new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.ANOMALY_RESULT_INDEX_ALIAS)) + new ADStat<>(true, new IndexStatusSupplier(indexUtils, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) ) .put( StatNames.MODELS_CHECKPOINT_INDEX_STATUS.getName(), - new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.CHECKPOINT_INDEX_NAME)) + new ADStat<>(true, new IndexStatusSupplier(indexUtils, ADCommonName.CHECKPOINT_INDEX_NAME)) ) .put( StatNames.ANOMALY_DETECTION_JOB_INDEX_STATUS.getName(), - new ADStat<>(true, new IndexStatusSupplier(indexUtils, AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)) + new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.JOB_INDEX)) ) .put( StatNames.ANOMALY_DETECTION_STATE_STATUS.getName(), - new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.DETECTION_STATE_INDEX)) + new ADStat<>(true, new IndexStatusSupplier(indexUtils, ADCommonName.DETECTION_STATE_INDEX)) ) .put(StatNames.DETECTOR_COUNT.getName(), new ADStat<>(true, new SettableSupplier())) .put(StatNames.SINGLE_ENTITY_DETECTOR_COUNT.getName(), new ADStat<>(true, new SettableSupplier())) @@ -659,18 +672,18 @@ public PooledObject wrap(LinkedBuffer obj) { CheckpointReadWorker checkpointReadQueue = new CheckpointReadWorker( heapSizeBytes, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, modelManager, checkpoint, coldstartQueue, @@ -678,7 +691,7 @@ public PooledObject wrap(LinkedBuffer obj) { stateManager, anomalyDetectionIndices, 
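// Illustration, not part of this commit: each stats entry above binds a stat
// name to an ADStat wrapping a value supplier; index-status stats resolve
// lazily through IndexUtils. The pattern, with the generic parameters that
// the surrounding formatting dropped restored:
Map<String, ADStat<?>> stats = ImmutableMap
    .<String, ADStat<?>>builder()
    .put(
        StatNames.ANOMALY_DETECTORS_INDEX_STATUS.getName(),
        new ADStat<>(true, new IndexStatusSupplier(indexUtils, CommonName.CONFIG_INDEX))
    )
    .put(StatNames.DETECTOR_COUNT.getName(), new ADStat<>(true, new SettableSupplier()))
    .build();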
cacheProvider, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, adStats ); @@ -686,19 +699,19 @@ public PooledObject wrap(LinkedBuffer obj) { ColdEntityWorker coldEntityQueue = new ColdEntityWorker( heapSizeBytes, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, clusterService, random, adCircuitBreakerService, threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, getClock(), - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, checkpointReadQueue, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, stateManager ); @@ -752,7 +765,7 @@ public PooledObject wrap(LinkedBuffer obj) { client, settings, threadPool, - CommonName.ANOMALY_RESULT_INDEX_ALIAS, + ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, anomalyDetectionIndices, this.clientUtil, this.indexUtils, @@ -768,7 +781,7 @@ public PooledObject wrap(LinkedBuffer obj) { client, stateManager, adTaskCacheManager, - AnomalyDetectorSettings.NUM_MIN_SAMPLES + TimeSeriesSettings.NUM_MIN_SAMPLES ); // return objects used by Guice to inject dependencies for e.g., @@ -778,8 +791,7 @@ public PooledObject wrap(LinkedBuffer obj) { anomalyDetectionIndices, anomalyDetectorRunner, searchFeatureDao, - singleFeatureLinearUniformInterpolator, - interpolator, + imputer, gson, jvmService, hashRing, @@ -796,7 +808,7 @@ public PooledObject wrap(LinkedBuffer obj) { getClock(), clientUtil, nodeFilter, - AnomalyDetectorSettings.CHECKPOINT_TTL, + AnomalyDetectorSettings.AD_CHECKPOINT_TTL, settings ), nodeFilter, @@ -851,14 +863,17 @@ public List> getExecutorBuilders(Settings settings) { @Override public List> getSettings() { - List> enabledSetting = EnabledSetting.getInstance().getSettings(); - List> numericSetting = NumericSetting.getInstance().getSettings(); + List> enabledSetting = ADEnabledSetting.getInstance().getSettings(); + List> numericSetting = ADNumericSetting.getInstance().getSettings(); List> systemSetting = ImmutableList .of( + // ====================================== + // AD settings + // ====================================== // HCAD cache LegacyOpenDistroAnomalyDetectorSettings.MAX_CACHE_MISS_HANDLING_PER_SECOND, - AnomalyDetectorSettings.DEDICATED_CACHE_SIZE, + AnomalyDetectorSettings.AD_DEDICATED_CACHE_SIZE, // Detector config LegacyOpenDistroAnomalyDetectorSettings.DETECTION_INTERVAL, LegacyOpenDistroAnomalyDetectorSettings.DETECTION_WINDOW_DELAY, @@ -873,12 +888,12 @@ public List> getSettings() { LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_MINUTES, LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_INITIAL_DELAY, LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF, - AnomalyDetectorSettings.REQUEST_TIMEOUT, - AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, - AnomalyDetectorSettings.COOLDOWN_MINUTES, - AnomalyDetectorSettings.BACKOFF_MINUTES, - AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY, - AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF, + AnomalyDetectorSettings.AD_REQUEST_TIMEOUT, + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_UNRESPONSIVE_NODE, + AnomalyDetectorSettings.AD_COOLDOWN_MINUTES, + 
AnomalyDetectorSettings.AD_BACKOFF_MINUTES, + AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY, + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF, // result index rollover LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD, LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS, @@ -892,15 +907,15 @@ public List> getSettings() { LegacyOpenDistroAnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS, LegacyOpenDistroAnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT, LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS, - AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE, - AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, - AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS, - AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT, - AnomalyDetectorSettings.INDEX_PRESSURE_HARD_LIMIT, - AnomalyDetectorSettings.MAX_PRIMARY_SHARDS, + AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE, + AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, + AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS, + AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT, + AnomalyDetectorSettings.AD_INDEX_PRESSURE_HARD_LIMIT, + AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS, // Security - LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, - AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, + LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, + AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, // Historical LegacyOpenDistroAnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, LegacyOpenDistroAnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS, @@ -914,34 +929,58 @@ public List> getSettings() { AnomalyDetectorSettings.MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS, AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS, // rate limiting - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, - AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, - AnomalyDetectorSettings.CHECKPOINT_TTL, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, 
+ AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, + AnomalyDetectorSettings.AD_CHECKPOINT_TTL, // query limit LegacyOpenDistroAnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY, LegacyOpenDistroAnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, - AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY, + AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY, AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, AnomalyDetectorSettings.MAX_CONCURRENT_PREVIEW, - AnomalyDetectorSettings.PAGE_SIZE, + AnomalyDetectorSettings.AD_PAGE_SIZE, // clean resource AnomalyDetectorSettings.DELETE_AD_RESULT_WHEN_DELETE_DETECTOR, // stats/profile API - AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE + AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE, + // ====================================== + // Forecast settings + // ====================================== + // result index rollover + ForecastSettings.FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD, + ForecastSettings.FORECAST_RESULT_HISTORY_RETENTION_PERIOD, + ForecastSettings.FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD, + // resource usage control + ForecastSettings.FORECAST_MODEL_MAX_SIZE_PERCENTAGE, + // TODO: add validation code + // ForecastSettings.FORECAST_MAX_SINGLE_STREAM_FORECASTERS, + // ForecastSettings.FORECAST_MAX_HC_FORECASTERS, + ForecastSettings.FORECAST_INDEX_PRESSURE_SOFT_LIMIT, + ForecastSettings.FORECAST_INDEX_PRESSURE_HARD_LIMIT, + ForecastSettings.FORECAST_MAX_PRIMARY_SHARDS, + // ====================================== + // Common settings + // ====================================== + // Fault tolerance + TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, + TimeSeriesSettings.BACKOFF_MINUTES, + TimeSeriesSettings.COOLDOWN_MINUTES, + // tasks + TimeSeriesSettings.MAX_CACHED_DELETED_TASKS ); return unmodifiableList( Stream @@ -959,7 +998,8 @@ public List getNamedXContent() { AnomalyDetector.XCONTENT_REGISTRY, AnomalyResult.XCONTENT_REGISTRY, DetectorInternalState.XCONTENT_REGISTRY, - AnomalyDetectorJob.XCONTENT_REGISTRY + Job.XCONTENT_REGISTRY, + Forecaster.XCONTENT_REGISTRY ); } @@ -1005,12 +1045,12 @@ public List getNamedXContent() { @Override public String getJobType() { - return AD_JOB_TYPE; + return TIME_SERIES_JOB_TYPE; } @Override public String getJobIndex() { - return AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; + return CommonName.JOB_INDEX; } @Override @@ -1022,7 +1062,7 @@ public ScheduledJobRunner getJobRunner() { public ScheduledJobParser getJobParser() { return (parser, id, jobDocVersion) -> { XContentParserUtils.ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser); - return AnomalyDetectorJob.parse(parser); + return Job.parse(parser); }; } diff --git a/src/main/java/org/opensearch/ad/annotation/Generated.java b/src/main/java/org/opensearch/timeseries/annotation/Generated.java similarity index 94% rename from src/main/java/org/opensearch/ad/annotation/Generated.java rename to src/main/java/org/opensearch/timeseries/annotation/Generated.java index 08156483a..d3812c9ca 100644 --- a/src/main/java/org/opensearch/ad/annotation/Generated.java +++ b/src/main/java/org/opensearch/timeseries/annotation/Generated.java @@ -9,7 +9,7 @@ * GitHub history for details. 
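// Illustration, not part of this commit: the job-scheduler hooks above route
// every scheduled document through Job.parse, so AD and forecasting share one
// job index and one job type. The parser contract restated as a plain lambda,
// with the shapes taken from this diff:
ScheduledJobParser jobParser = (parser, id, jobDocVersion) -> {
    XContentParserUtils.ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
    return Job.parse(parser);   // one Job model for both analysis types
};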
*/ -package org.opensearch.ad.annotation; +package org.opensearch.timeseries.annotation; import java.lang.annotation.ElementType; import java.lang.annotation.Retention; diff --git a/src/main/java/org/opensearch/ad/breaker/BreakerName.java b/src/main/java/org/opensearch/timeseries/breaker/BreakerName.java similarity index 92% rename from src/main/java/org/opensearch/ad/breaker/BreakerName.java rename to src/main/java/org/opensearch/timeseries/breaker/BreakerName.java index a6405cf1f..5c744355b 100644 --- a/src/main/java/org/opensearch/ad/breaker/BreakerName.java +++ b/src/main/java/org/opensearch/timeseries/breaker/BreakerName.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.breaker; +package org.opensearch.timeseries.breaker; public enum BreakerName { diff --git a/src/main/java/org/opensearch/ad/breaker/CircuitBreaker.java b/src/main/java/org/opensearch/timeseries/breaker/CircuitBreaker.java similarity index 91% rename from src/main/java/org/opensearch/ad/breaker/CircuitBreaker.java rename to src/main/java/org/opensearch/timeseries/breaker/CircuitBreaker.java index 2825d2f98..5258ac64e 100644 --- a/src/main/java/org/opensearch/ad/breaker/CircuitBreaker.java +++ b/src/main/java/org/opensearch/timeseries/breaker/CircuitBreaker.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.breaker; +package org.opensearch.timeseries.breaker; /** * An interface for circuit breaker. diff --git a/src/main/java/org/opensearch/ad/breaker/ADCircuitBreakerService.java b/src/main/java/org/opensearch/timeseries/breaker/CircuitBreakerService.java similarity index 80% rename from src/main/java/org/opensearch/ad/breaker/ADCircuitBreakerService.java rename to src/main/java/org/opensearch/timeseries/breaker/CircuitBreakerService.java index 7d4667576..efa48ec7f 100644 --- a/src/main/java/org/opensearch/ad/breaker/ADCircuitBreakerService.java +++ b/src/main/java/org/opensearch/timeseries/breaker/CircuitBreakerService.java @@ -9,34 +9,34 @@ * GitHub history for details. */ -package org.opensearch.ad.breaker; +package org.opensearch.timeseries.breaker; import java.util.concurrent.ConcurrentHashMap; import java.util.concurrent.ConcurrentMap; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.settings.EnabledSetting; +import org.opensearch.ad.settings.ADEnabledSetting; import org.opensearch.monitor.jvm.JvmService; /** - * Class {@code ADCircuitBreakerService} provide storing, retrieving circuit breakers functions. + * Class {@code CircuitBreakerService} provide storing, retrieving circuit breakers functions. * * This service registers internal system breakers and provide API for users to register their own breakers. */ -public class ADCircuitBreakerService { +public class CircuitBreakerService { private final ConcurrentMap breakers = new ConcurrentHashMap<>(); private final JvmService jvmService; - private static final Logger logger = LogManager.getLogger(ADCircuitBreakerService.class); + private static final Logger logger = LogManager.getLogger(CircuitBreakerService.class); /** * Constructor. 
* * @param jvmService jvm info */ - public ADCircuitBreakerService(JvmService jvmService) { + public CircuitBreakerService(JvmService jvmService) { this.jvmService = jvmService; } @@ -67,7 +67,7 @@ public CircuitBreaker getBreaker(String name) { * * @return ADCircuitBreakerService */ - public ADCircuitBreakerService init() { + public CircuitBreakerService init() { // Register memory circuit breaker registerBreaker(BreakerName.MEM.getName(), new MemoryCircuitBreaker(this.jvmService)); logger.info("Registered memory breaker."); @@ -76,7 +76,7 @@ public ADCircuitBreakerService init() { } public Boolean isOpen() { - if (!EnabledSetting.isADBreakerEnabled()) { + if (!ADEnabledSetting.isADBreakerEnabled()) { return false; } diff --git a/src/main/java/org/opensearch/ad/breaker/MemoryCircuitBreaker.java b/src/main/java/org/opensearch/timeseries/breaker/MemoryCircuitBreaker.java similarity index 95% rename from src/main/java/org/opensearch/ad/breaker/MemoryCircuitBreaker.java rename to src/main/java/org/opensearch/timeseries/breaker/MemoryCircuitBreaker.java index c4628c639..cf4b47d71 100644 --- a/src/main/java/org/opensearch/ad/breaker/MemoryCircuitBreaker.java +++ b/src/main/java/org/opensearch/timeseries/breaker/MemoryCircuitBreaker.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.breaker; +package org.opensearch.timeseries.breaker; import org.opensearch.monitor.jvm.JvmService; diff --git a/src/main/java/org/opensearch/ad/breaker/ThresholdCircuitBreaker.java b/src/main/java/org/opensearch/timeseries/breaker/ThresholdCircuitBreaker.java similarity index 94% rename from src/main/java/org/opensearch/ad/breaker/ThresholdCircuitBreaker.java rename to src/main/java/org/opensearch/timeseries/breaker/ThresholdCircuitBreaker.java index 30959b0c4..5d69ce1f9 100644 --- a/src/main/java/org/opensearch/ad/breaker/ThresholdCircuitBreaker.java +++ b/src/main/java/org/opensearch/timeseries/breaker/ThresholdCircuitBreaker.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.breaker; +package org.opensearch.timeseries.breaker; /** * An abstract class for all breakers with threshold. diff --git a/src/main/java/org/opensearch/timeseries/common/exception/ClientException.java b/src/main/java/org/opensearch/timeseries/common/exception/ClientException.java new file mode 100644 index 000000000..d8be97f37 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/common/exception/ClientException.java @@ -0,0 +1,34 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.common.exception; + +/** + * All exception visible to transport layer's client is under ClientException. 
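// Illustration, not part of this commit: with ClientException extending
// TimeSeriesException, error handling can branch on the shared hierarchy
// instead of AD-specific classes: caller-visible failures surface as-is,
// internal ones are logged and optionally counted. A sketch; runAnalysis and
// the stat lookup are assumptions:
try {
    runAnalysis(configId);
} catch (ClientException e) {
    listener.onFailure(e);          // caller-visible: pass through to the transport client
} catch (TimeSeriesException e) {
    LOG.error("Analysis " + e.getConfigId() + " failed", e);
    if (e.isCountedInStats()) {
        // only exceptions flagged countedInStats feed the failure counters
        adStats.getStat(StatNames.AD_EXECUTE_FAIL_COUNT.getName()).increment();
    }
}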
+ */ +public class ClientException extends TimeSeriesException { + + public ClientException(String message) { + super(message); + } + + public ClientException(String configId, String message) { + super(configId, message); + } + + public ClientException(String configId, String message, Throwable throwable) { + super(configId, message, throwable); + } + + public ClientException(String configId, Throwable cause) { + super(configId, cause); + } +} diff --git a/src/main/java/org/opensearch/ad/common/exception/DuplicateTaskException.java b/src/main/java/org/opensearch/timeseries/common/exception/DuplicateTaskException.java similarity index 77% rename from src/main/java/org/opensearch/ad/common/exception/DuplicateTaskException.java rename to src/main/java/org/opensearch/timeseries/common/exception/DuplicateTaskException.java index 378e3cc2a..1791e322d 100644 --- a/src/main/java/org/opensearch/ad/common/exception/DuplicateTaskException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/DuplicateTaskException.java @@ -9,9 +9,9 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; -public class DuplicateTaskException extends AnomalyDetectionException { +public class DuplicateTaskException extends TimeSeriesException { public DuplicateTaskException(String msg) { super(msg); diff --git a/src/main/java/org/opensearch/ad/common/exception/EndRunException.java b/src/main/java/org/opensearch/timeseries/common/exception/EndRunException.java similarity index 75% rename from src/main/java/org/opensearch/ad/common/exception/EndRunException.java rename to src/main/java/org/opensearch/timeseries/common/exception/EndRunException.java index 2408b77b7..a4b11c621 100644 --- a/src/main/java/org/opensearch/ad/common/exception/EndRunException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/EndRunException.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; /** * Exception for failures that might impact the customer. @@ -23,13 +23,13 @@ public EndRunException(String message, boolean endNow) { this.endNow = endNow; } - public EndRunException(String anomalyDetectorId, String message, boolean endNow) { - super(anomalyDetectorId, message); + public EndRunException(String configId, String message, boolean endNow) { + super(configId, message); this.endNow = endNow; } - public EndRunException(String anomalyDetectorId, String message, Throwable throwable, boolean endNow) { - super(anomalyDetectorId, message, throwable); + public EndRunException(String configId, String message, Throwable throwable, boolean endNow) { + super(configId, message, throwable); this.endNow = endNow; } diff --git a/src/main/java/org/opensearch/timeseries/common/exception/InternalFailure.java b/src/main/java/org/opensearch/timeseries/common/exception/InternalFailure.java new file mode 100644 index 000000000..c7c9048cb --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/common/exception/InternalFailure.java @@ -0,0 +1,35 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.common.exception; + +/** + * Exception for root cause unknown failure. 
Maybe transient. Client can continue the detector running. + * + */ +public class InternalFailure extends ClientException { + + public InternalFailure(String configId, String message) { + super(configId, message); + } + + public InternalFailure(String configId, String message, Throwable cause) { + super(configId, message, cause); + } + + public InternalFailure(String configId, Throwable cause) { + super(configId, cause); + } + + public InternalFailure(TimeSeriesException cause) { + super(cause.getConfigId(), cause); + } +} diff --git a/src/main/java/org/opensearch/ad/common/exception/LimitExceededException.java b/src/main/java/org/opensearch/timeseries/common/exception/LimitExceededException.java similarity index 61% rename from src/main/java/org/opensearch/ad/common/exception/LimitExceededException.java rename to src/main/java/org/opensearch/timeseries/common/exception/LimitExceededException.java index d0d230b4a..e51a2bc4e 100644 --- a/src/main/java/org/opensearch/ad/common/exception/LimitExceededException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/LimitExceededException.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; /** * This exception is thrown when a user/system limit is exceeded. @@ -17,13 +17,13 @@ public class LimitExceededException extends EndRunException { /** - * Constructor with an anomaly detector ID and an explanation. + * Constructor with a config ID and an explanation. * - * @param anomalyDetectorId ID of the anomaly detector for which the limit is exceeded + * @param id ID of the time series analysis for which the limit is exceeded * @param message explanation for the limit */ - public LimitExceededException(String anomalyDetectorId, String message) { - super(anomalyDetectorId, message, true); + public LimitExceededException(String id, String message) { + super(id, message, true); this.countedInStats(false); } @@ -47,14 +47,14 @@ public LimitExceededException(String message, boolean endRun) { } /** - * Constructor with an anomaly detector ID and an explanation, and a flag for stopping. + * Constructor with a config ID and an explanation, and a flag for stopping. * - * @param anomalyDetectorId ID of the anomaly detector for which the limit is exceeded + * @param id ID of the time series analysis for which the limit is exceeded * @param message explanation for the limit - * @param stopNow whether to stop detector immediately + * @param stopNow whether to stop time series analysis immediately */ - public LimitExceededException(String anomalyDetectorId, String message, boolean stopNow) { - super(anomalyDetectorId, message, stopNow); + public LimitExceededException(String id, String message, boolean stopNow) { + super(id, message, stopNow); this.countedInStats(false); } } diff --git a/src/main/java/org/opensearch/ad/common/exception/NotSerializedADExceptionName.java b/src/main/java/org/opensearch/timeseries/common/exception/NotSerializedExceptionName.java similarity index 66% rename from src/main/java/org/opensearch/ad/common/exception/NotSerializedADExceptionName.java rename to src/main/java/org/opensearch/timeseries/common/exception/NotSerializedExceptionName.java index d1e7d76c6..b85005e01 100644 --- a/src/main/java/org/opensearch/ad/common/exception/NotSerializedADExceptionName.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/NotSerializedExceptionName.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; import static org.opensearch.OpenSearchException.getExceptionName; @@ -28,23 +28,23 @@ * check its root cause message. * */ -public enum NotSerializedADExceptionName { +public enum NotSerializedExceptionName { RESOURCE_NOT_FOUND_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ResourceNotFoundException("", ""))), LIMIT_EXCEEDED_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new LimitExceededException("", "", false))), END_RUN_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new EndRunException("", "", false))), - ANOMALY_DETECTION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new AnomalyDetectionException("", ""))), + TIME_SERIES_DETECTION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new TimeSeriesException("", ""))), INTERNAL_FAILURE_NAME_UNDERSCORE(getExceptionName(new InternalFailure("", ""))), CLIENT_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ClientException("", ""))), - CANCELLATION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ADTaskCancelledException("", ""))), + CANCELLATION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new TaskCancelledException("", ""))), DUPLICATE_TASK_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new DuplicateTaskException(""))), - AD_VERSION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ADVersionException(""))), - AD_VALIDATION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ADValidationException("", null, null))); + VERSION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new VersionException(""))), + VALIDATION_EXCEPTION_NAME_UNDERSCORE(getExceptionName(new ValidationException("", null, null))); - private static final Logger LOG = LogManager.getLogger(NotSerializedADExceptionName.class); + private static final Logger LOG = LogManager.getLogger(NotSerializedExceptionName.class); private final String name; - NotSerializedADExceptionName(String name) { + NotSerializedExceptionName(String name) { this.name = name; } @@ -53,55 +53,55 @@ public String getName() { } /** - * Convert from a NotSerializableExceptionWrapper to an AnomalyDetectionException. + * Convert from a NotSerializableExceptionWrapper to a TimeSeriesException. * Since NotSerializableExceptionWrapper does not keep some details we need, we * initialize the exception with default values. * @param exception an NotSerializableExceptionWrapper exception. - * @param adID Detector Id. - * @return converted AnomalyDetectionException + * @param configID Config Id. 
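// Illustration, not part of this commit: a typical call site for the
// convertWrappedTimeSeriesException method defined just below, used when a
// remote node returns OpenSearch's NotSerializableExceptionWrapper instead of
// the original exception type (helper name assumed for the sketch):
static Exception unwrap(Exception received, String configId) {
    if (received instanceof NotSerializableExceptionWrapper) {
        return NotSerializedExceptionName
            .convertWrappedTimeSeriesException((NotSerializableExceptionWrapper) received, configId)
            .map(e -> (Exception) e)   // recovered a typed TimeSeriesException
            .orElse(received);
    }
    return received;
}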
+ * @return converted TimeSeriesException */ - public static Optional convertWrappedAnomalyDetectionException( + public static Optional convertWrappedTimeSeriesException( NotSerializableExceptionWrapper exception, - String adID + String configID ) { String exceptionMsg = exception.getMessage().trim(); - AnomalyDetectionException convertedException = null; - for (NotSerializedADExceptionName adException : values()) { - if (exceptionMsg.startsWith(adException.getName())) { - switch (adException) { + TimeSeriesException convertedException = null; + for (NotSerializedExceptionName timeseriesException : values()) { + if (exceptionMsg.startsWith(timeseriesException.getName())) { + switch (timeseriesException) { case RESOURCE_NOT_FOUND_EXCEPTION_NAME_UNDERSCORE: - convertedException = new ResourceNotFoundException(adID, exceptionMsg); + convertedException = new ResourceNotFoundException(configID, exceptionMsg); break; case LIMIT_EXCEEDED_EXCEPTION_NAME_UNDERSCORE: - convertedException = new LimitExceededException(adID, exceptionMsg, false); + convertedException = new LimitExceededException(configID, exceptionMsg, false); break; case END_RUN_EXCEPTION_NAME_UNDERSCORE: - convertedException = new EndRunException(adID, exceptionMsg, false); + convertedException = new EndRunException(configID, exceptionMsg, false); break; - case ANOMALY_DETECTION_EXCEPTION_NAME_UNDERSCORE: - convertedException = new AnomalyDetectionException(adID, exceptionMsg); + case TIME_SERIES_DETECTION_EXCEPTION_NAME_UNDERSCORE: + convertedException = new TimeSeriesException(configID, exceptionMsg); break; case INTERNAL_FAILURE_NAME_UNDERSCORE: - convertedException = new InternalFailure(adID, exceptionMsg); + convertedException = new InternalFailure(configID, exceptionMsg); break; case CLIENT_EXCEPTION_NAME_UNDERSCORE: - convertedException = new ClientException(adID, exceptionMsg); + convertedException = new ClientException(configID, exceptionMsg); break; case CANCELLATION_EXCEPTION_NAME_UNDERSCORE: - convertedException = new ADTaskCancelledException(exceptionMsg, ""); + convertedException = new TaskCancelledException(exceptionMsg, ""); break; case DUPLICATE_TASK_EXCEPTION_NAME_UNDERSCORE: convertedException = new DuplicateTaskException(exceptionMsg); break; - case AD_VERSION_EXCEPTION_NAME_UNDERSCORE: - convertedException = new ADVersionException(exceptionMsg); + case VERSION_EXCEPTION_NAME_UNDERSCORE: + convertedException = new VersionException(exceptionMsg); break; - case AD_VALIDATION_EXCEPTION_NAME_UNDERSCORE: - convertedException = new ADValidationException(exceptionMsg, null, null); + case VALIDATION_EXCEPTION_NAME_UNDERSCORE: + convertedException = new ValidationException(exceptionMsg, null, null); break; default: - LOG.warn(new ParameterizedMessage("Unexpected AD exception {}", adException)); + LOG.warn(new ParameterizedMessage("Unexpected exception {}", timeseriesException)); break; } } diff --git a/src/main/java/org/opensearch/ad/common/exception/ResourceNotFoundException.java b/src/main/java/org/opensearch/timeseries/common/exception/ResourceNotFoundException.java similarity index 62% rename from src/main/java/org/opensearch/ad/common/exception/ResourceNotFoundException.java rename to src/main/java/org/opensearch/timeseries/common/exception/ResourceNotFoundException.java index 450f509f7..eddbcac99 100644 --- a/src/main/java/org/opensearch/ad/common/exception/ResourceNotFoundException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/ResourceNotFoundException.java @@ -9,21 +9,21 @@ * GitHub history 
for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; /** * This exception is thrown when a resource is not found. */ -public class ResourceNotFoundException extends AnomalyDetectionException { +public class ResourceNotFoundException extends TimeSeriesException { /** - * Constructor with an anomaly detector ID and a message. + * Constructor with a config ID and a message. * - * @param detectorId ID of the detector related to the resource + * @param configId ID of the config related to the resource * @param message explains which resource is not found */ - public ResourceNotFoundException(String detectorId, String message) { - super(detectorId, message); + public ResourceNotFoundException(String configId, String message) { + super(configId, message); countedInStats(false); } diff --git a/src/main/java/org/opensearch/ad/common/exception/ADTaskCancelledException.java b/src/main/java/org/opensearch/timeseries/common/exception/TaskCancelledException.java similarity index 73% rename from src/main/java/org/opensearch/ad/common/exception/ADTaskCancelledException.java rename to src/main/java/org/opensearch/timeseries/common/exception/TaskCancelledException.java index f981c2f79..ba0c3d600 100644 --- a/src/main/java/org/opensearch/ad/common/exception/ADTaskCancelledException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/TaskCancelledException.java @@ -9,12 +9,12 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; -public class ADTaskCancelledException extends AnomalyDetectionException { +public class TaskCancelledException extends TimeSeriesException { private String cancelledBy; - public ADTaskCancelledException(String msg, String user) { + public TaskCancelledException(String msg, String user) { super(msg); this.cancelledBy = user; this.countedInStats(false); diff --git a/src/main/java/org/opensearch/ad/common/exception/AnomalyDetectionException.java b/src/main/java/org/opensearch/timeseries/common/exception/TimeSeriesException.java similarity index 55% rename from src/main/java/org/opensearch/ad/common/exception/AnomalyDetectionException.java rename to src/main/java/org/opensearch/timeseries/common/exception/TimeSeriesException.java index 882ab2530..caa2573a9 100644 --- a/src/main/java/org/opensearch/ad/common/exception/AnomalyDetectionException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/TimeSeriesException.java @@ -9,54 +9,54 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; /** - * Base exception for exceptions thrown from Anomaly Detection. + * Base exception for exceptions thrown from time series analytics. */ -public class AnomalyDetectionException extends RuntimeException { +public class TimeSeriesException extends RuntimeException { - private String anomalyDetectorId; + private String configId; // countedInStats will be used to tell whether the exception should be // counted in failure stats. private boolean countedInStats = true; - public AnomalyDetectionException(String message) { + public TimeSeriesException(String message) { super(message); } /** - * Constructor with an anomaly detector ID and a message. + * Constructor with a config ID and a message.
* - * @param anomalyDetectorId anomaly detector ID + * @param configId config ID * @param message message of the exception */ - public AnomalyDetectionException(String anomalyDetectorId, String message) { + public TimeSeriesException(String configId, String message) { super(message); - this.anomalyDetectorId = anomalyDetectorId; + this.configId = configId; } - public AnomalyDetectionException(String adID, String message, Throwable cause) { + public TimeSeriesException(String configID, String message, Throwable cause) { super(message, cause); - this.anomalyDetectorId = adID; + this.configId = configID; } - public AnomalyDetectionException(Throwable cause) { + public TimeSeriesException(Throwable cause) { super(cause); } - public AnomalyDetectionException(String adID, Throwable cause) { + public TimeSeriesException(String configID, Throwable cause) { super(cause); - this.anomalyDetectorId = adID; + this.configId = configID; } /** - * Returns the ID of the anomaly detector. + * Returns the ID of the analysis config. * - * @return anomaly detector ID + * @return config ID */ - public String getAnomalyDetectorId() { - return this.anomalyDetectorId; + public String getConfigId() { + return this.configId; } /** @@ -74,7 +74,7 @@ public boolean isCountedInStats() { * @param countInStats count the exception in stats * @return the exception itself */ - public AnomalyDetectionException countedInStats(boolean countInStats) { + public TimeSeriesException countedInStats(boolean countInStats) { this.countedInStats = countInStats; return this; } @@ -82,8 +82,7 @@ public AnomalyDetectionException countedInStats(boolean countInStats) { @Override public String toString() { StringBuilder sb = new StringBuilder(); - sb.append("Anomaly Detector "); - sb.append(anomalyDetectorId); + sb.append(configId); sb.append(' '); sb.append(super.toString()); return sb.toString(); diff --git a/src/main/java/org/opensearch/ad/common/exception/ADValidationException.java b/src/main/java/org/opensearch/timeseries/common/exception/ValidationException.java similarity index 69% rename from src/main/java/org/opensearch/ad/common/exception/ADValidationException.java rename to src/main/java/org/opensearch/timeseries/common/exception/ValidationException.java index 6b068070b..4c18c13fe 100644 --- a/src/main/java/org/opensearch/ad/common/exception/ADValidationException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/ValidationException.java @@ -9,19 +9,19 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectorValidationIssueType; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.model.ValidationAspect; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; -public class ADValidationException extends AnomalyDetectionException { - private final DetectorValidationIssueType type; +public class ValidationException extends TimeSeriesException { + private final ValidationIssueType type; private final ValidationAspect aspect; private final IntervalTimeConfiguration intervalSuggestion; - public DetectorValidationIssueType getType() { + public ValidationIssueType getType() { return type; } @@ -33,27 +33,27 @@ public IntervalTimeConfiguration getIntervalSuggestion() { return intervalSuggestion; } - public ADValidationException(String message, DetectorValidationIssueType type, ValidationAspect aspect) { + public ValidationException(String message, ValidationIssueType type, ValidationAspect aspect) { this(message, null, type, aspect, null); } - public ADValidationException( + public ValidationException( String message, - DetectorValidationIssueType type, + ValidationIssueType type, ValidationAspect aspect, IntervalTimeConfiguration intervalSuggestion ) { this(message, null, type, aspect, intervalSuggestion); } - public ADValidationException( + public ValidationException( String message, Throwable cause, - DetectorValidationIssueType type, + ValidationIssueType type, ValidationAspect aspect, IntervalTimeConfiguration intervalSuggestion ) { - super(AnomalyDetector.NO_ID, message, cause); + super(Config.NO_ID, message, cause); this.type = type; this.aspect = aspect; this.intervalSuggestion = intervalSuggestion; diff --git a/src/main/java/org/opensearch/ad/common/exception/ADVersionException.java b/src/main/java/org/opensearch/timeseries/common/exception/VersionException.java similarity index 57% rename from src/main/java/org/opensearch/ad/common/exception/ADVersionException.java rename to src/main/java/org/opensearch/timeseries/common/exception/VersionException.java index 3811a9980..b9fac314c 100644 --- a/src/main/java/org/opensearch/ad/common/exception/ADVersionException.java +++ b/src/main/java/org/opensearch/timeseries/common/exception/VersionException.java @@ -9,18 +9,18 @@ * GitHub history for details. */ -package org.opensearch.ad.common.exception; +package org.opensearch.timeseries.common.exception; /** * AD version incompatible exception. 
*/ -public class ADVersionException extends AnomalyDetectionException { +public class VersionException extends TimeSeriesException { - public ADVersionException(String message) { + public VersionException(String message) { super(message); } - public ADVersionException(String anomalyDetectorId, String message) { - super(anomalyDetectorId, message); + public VersionException(String configId, String message) { + super(configId, message); } } diff --git a/src/main/java/org/opensearch/timeseries/constant/CommonMessages.java b/src/main/java/org/opensearch/timeseries/constant/CommonMessages.java new file mode 100644 index 000000000..0576f9693 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/constant/CommonMessages.java @@ -0,0 +1,142 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.constant; + +import java.util.Locale; + +public class CommonMessages { + // ====================================== + // Validation message + // ====================================== + public static String NEGATIVE_TIME_CONFIGURATION = "should be non-negative"; + public static String INVALID_RESULT_INDEX_MAPPING = "Result index mapping is not correct for index: "; + public static String EMPTY_NAME = "name should be set"; + public static String NULL_TIME_FIELD = "Time field should be set"; + public static String EMPTY_INDICES = "Indices should be set"; + + public static String getTooManyCategoricalFieldErr(int limit) { + return String.format(Locale.ROOT, CommonMessages.TOO_MANY_CATEGORICAL_FIELD_ERR_MSG_FORMAT, limit); + } + + public static final String TOO_MANY_CATEGORICAL_FIELD_ERR_MSG_FORMAT = + "Currently we only support up to %d categorical field/s in order to bound system resource consumption."; + + public static String CAN_NOT_FIND_RESULT_INDEX = "Can't find result index "; + public static String INVALID_CHAR_IN_RESULT_INDEX_NAME = + "Result index name has invalid character. 
Valid characters are a-z, 0-9, -(hyphen) and _(underscore)"; + public static String FAIL_TO_VALIDATE = "failed to validate"; + public static String INVALID_TIMESTAMP = "Timestamp field: (%s) must be of type date"; + public static String NON_EXISTENT_TIMESTAMP = "Timestamp field: (%s) is not found in index mapping"; + public static String INVALID_NAME = "Valid characters for name are a-z, A-Z, 0-9, -(hyphen), _(underscore) and .(period)"; + // change this error message to make it compatible with old version's integration(nexus) test + public static String FAIL_TO_FIND_CONFIG_MSG = "Can't find config with id: "; + public static final String CAN_NOT_CHANGE_CATEGORY_FIELD = "Can't change category field"; + public static final String CAN_NOT_CHANGE_CUSTOM_RESULT_INDEX = "Can't change custom result index"; + public static final String CATEGORICAL_FIELD_TYPE_ERR_MSG = "A categorical field must be of type keyword or ip."; + // Modifying message for FEATURE below may break the parseADValidationException method of ValidateAnomalyDetectorTransportAction + public static final String FEATURE_INVALID_MSG_PREFIX = "Feature has an invalid query"; + public static final String FEATURE_WITH_EMPTY_DATA_MSG = FEATURE_INVALID_MSG_PREFIX + " returning empty aggregated data: "; + public static final String FEATURE_WITH_INVALID_QUERY_MSG = FEATURE_INVALID_MSG_PREFIX + " causing a runtime exception: "; + public static final String UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG = + "Feature has an unknown exception caught while executing the feature query: "; + public static String DUPLICATE_FEATURE_AGGREGATION_NAMES = "Config has duplicate feature aggregation query names: "; + public static String TIME_FIELD_NOT_ENOUGH_HISTORICAL_DATA = + "There isn't enough historical data found with current timefield selected."; + public static String CATEGORY_FIELD_TOO_SPARSE = + "Data is most likely too sparse with the given category fields. Consider revising category field/s or ingesting more data "; + public static String WINDOW_DELAY_REC = + "Latest seen data point is at least %d minutes ago, consider changing window delay to at least %d minutes."; + public static String INTERVAL_REC = "The selected interval might collect sparse data. Consider changing interval length to: "; + public static String RAW_DATA_TOO_SPARSE = + "Source index data is potentially too sparse for model training. Consider changing interval length or ingesting more data"; + public static String MODEL_VALIDATION_FAILED_UNEXPECTEDLY = "Model validation experienced issues completing."; + public static String FILTER_QUERY_TOO_SPARSE = "Data is too sparse after data filter is applied. Consider changing the data filter"; + public static String CATEGORY_FIELD_NO_DATA = + "No entity was found with the given categorical fields. Consider revising category field/s or ingesting more data"; + public static String FEATURE_QUERY_TOO_SPARSE = + "Data is most likely too sparse when given feature queries are applied. 
Consider revising feature queries."; + public static String TIMEOUT_ON_INTERVAL_REC = "Timed out getting interval recommendation"; + + // ====================================== + // Index message + // ====================================== + public static final String CREATE_INDEX_NOT_ACKNOWLEDGED = "Create index %S not acknowledged by OpenSearch core"; + public static final String SUCCESS_SAVING_RESULT_MSG = "Result saved successfully."; + public static final String CANNOT_SAVE_RESULT_ERR_MSG = "Cannot save results due to write block."; + + // ====================================== + // Resource constraints + // ====================================== + public static final String MEMORY_CIRCUIT_BROKEN_ERR_MSG = + "The total OpenSearch memory usage exceeds our threshold, opening the AD memory circuit."; + + // ====================================== + // Transport + // ====================================== + public static final String INVALID_TIMESTAMP_ERR_MSG = "timestamp is invalid"; + public static String FAIL_TO_DELETE_CONFIG = "Fail to delete config"; + public static String FAIL_TO_GET_CONFIG_INFO = "Fail to get config info"; + + // ====================================== + // transport/restful client + // ====================================== + public static final String WAIT_ERR_MSG = "Exception in waiting for result"; + public static final String ALL_FEATURES_DISABLED_ERR_MSG = + "Having trouble querying data because all of your features have been disabled."; + // We need this invalid query tag to show proper error message on frontend + // refer to AD Dashboard code: https://tinyurl.com/8b5n8hat + public static final String INVALID_SEARCH_QUERY_MSG = "Invalid search query."; + public static final String NO_REQUESTS_ADDED_ERR = "no requests added"; + + // ====================================== + // rate limiting worker + // ====================================== + public static final String BUG_RESPONSE = "We might have bugs."; + public static final String MEMORY_LIMIT_EXCEEDED_ERR_MSG = "Models memory usage exceeds our limit."; + + // ====================================== + // security + // ====================================== + public static String NO_PERMISSION_TO_ACCESS_CONFIG = "User does not have permissions to access config: "; + public static String FAIL_TO_GET_USER_INFO = "Unable to get user information from config "; + + // ====================================== + // transport + // ====================================== + public static final String CONFIG_ID_MISSING_MSG = "config ID is missing"; + public static final String MODEL_ID_MISSING_MSG = "model ID is missing"; + + // ====================================== + // task + // ====================================== + public static String CAN_NOT_FIND_LATEST_TASK = "can't find latest task"; + + // ====================================== + // Job + // ====================================== + public static String CONFIG_IS_RUNNING = "Config is already running"; + public static String FAIL_TO_SEARCH = "Fail to search"; + + // ====================================== + // Profile API + // ====================================== + public static String EMPTY_PROFILES_COLLECT = "profiles to collect are missing or invalid"; + public static String FAIL_TO_PARSE_CONFIG_MSG = "Fail to parse config with id: "; + public static String FAIL_FETCH_ERR_MSG = "Fail to fetch profile for "; + public static String FAIL_TO_GET_PROFILE_MSG = "Fail to get profile for config "; + public static String FAIL_TO_GET_TOTAL_ENTITIES = "Failed to get 
total entities for config "; + + // ====================================== + // Stats API + // ====================================== + public static String FAIL_TO_GET_STATS = "Fail to get stats"; +} diff --git a/src/main/java/org/opensearch/timeseries/constant/CommonName.java b/src/main/java/org/opensearch/timeseries/constant/CommonName.java new file mode 100644 index 000000000..0b997ea5d --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/constant/CommonName.java @@ -0,0 +1,116 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.constant; + +public class CommonName { + + // ====================================== + // Index mapping + // ====================================== + // Elastic mapping type + public static final String MAPPING_TYPE = "_doc"; + // used for updating mapping + public static final String SCHEMA_VERSION_FIELD = "schema_version"; + + // Used to fetch mapping + public static final String TYPE = "type"; + public static final String KEYWORD_TYPE = "keyword"; + public static final String IP_TYPE = "ip"; + public static final String DATE_TYPE = "date"; + + // ====================================== + // Index name + // ====================================== + // config index. We are reusing ad detector index. + public static final String CONFIG_INDEX = ".opendistro-anomaly-detectors"; + + // job index. We are reusing ad job index. + public static final String JOB_INDEX = ".opendistro-anomaly-detector-jobs"; + + // ====================================== + // Validation + // ====================================== + public static final String MODEL_ASPECT = "model"; + public static final String CONFIG_ID_MISSING_MSG = "config ID is missing"; + + // ====================================== + // Used for custom forecast result index + // ====================================== + public static final String PROPERTIES = "properties"; + + // ====================================== + // Used in toXContent + // ====================================== + public static final String START_JSON_KEY = "start"; + public static final String END_JSON_KEY = "end"; + public static final String ENTITIES_JSON_KEY = "entities"; + public static final String ENTITY_KEY = "entity"; + public static final String VALUE_JSON_KEY = "value"; + public static final String VALUE_LIST_FIELD = "value_list"; + public static final String FEATURE_DATA_FIELD = "feature_data"; + public static final String DATA_START_TIME_FIELD = "data_start_time"; + public static final String DATA_END_TIME_FIELD = "data_end_time"; + public static final String EXECUTION_START_TIME_FIELD = "execution_start_time"; + public static final String EXECUTION_END_TIME_FIELD = "execution_end_time"; + public static final String ERROR_FIELD = "error"; + public static final String ENTITY_FIELD = "entity"; + public static final String USER_FIELD = "user"; + public static final String CONFIDENCE_FIELD = "confidence"; + public static final String DATA_QUALITY_FIELD = "data_quality"; + // MODEL_ID_FIELD can be used in profile and stats API as well + public static final String MODEL_ID_FIELD = "model_id"; + public static final String TIMESTAMP = "timestamp"; + public static final String FIELD_MODEL = "model"; + + // entity sample in checkpoint. 
+ // kept for bwc purpose + public static final String ENTITY_SAMPLE = "sp"; + // current key for entity samples + public static final String ENTITY_SAMPLE_QUEUE = "samples"; + + // ====================================== + // Profile name + // ====================================== + public static final String MODEL_SIZE_IN_BYTES = "model_size_in_bytes"; + + // ====================================== + // Used for backward-compatibility in messaging + // ====================================== + public static final String EMPTY_FIELD = ""; + + // ====================================== + // Query + // ====================================== + // Used in finding the max timestamp + public static final String AGG_NAME_MAX_TIME = "max_timefield"; + // Used in finding the min timestamp + public static final String AGG_NAME_MIN_TIME = "min_timefield"; + // date histogram aggregation name + public static final String DATE_HISTOGRAM = "date_histogram"; + // feature aggregation name + public static final String FEATURE_AGGS = "feature_aggs"; + + // ====================================== + // Used in toXContent + // ====================================== + public static final String CONFIG_ID_KEY = "config_id"; + public static final String MODEL_ID_KEY = "model_id"; + public static final String TASK_ID_FIELD = "task_id"; + public static final String ENTITY_ID_FIELD = "entity_id"; + + // ====================================== + // plugin info + // ====================================== + public static final String TIME_SERIES_PLUGIN_NAME = "opensearch-time-series-analytics"; + public static final String TIME_SERIES_PLUGIN_NAME_FOR_TEST = "org.opensearch.timeseries.TimeSeriesAnalyticsPlugin"; + public static final String TIME_SERIES_PLUGIN_VERSION_FOR_TEST = "NA"; +} diff --git a/src/main/java/org/opensearch/timeseries/constant/CommonValue.java b/src/main/java/org/opensearch/timeseries/constant/CommonValue.java new file mode 100644 index 000000000..6f05f59d0 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/constant/CommonValue.java @@ -0,0 +1,12 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.constant; + +public class CommonValue { + // unknown or no schema version + public static Integer NO_SCHEMA_VERSION = 0; + +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/FixedValueImputer.java b/src/main/java/org/opensearch/timeseries/dataprocessor/FixedValueImputer.java new file mode 100644 index 000000000..9b8f6bf21 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/FixedValueImputer.java @@ -0,0 +1,47 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import java.util.Arrays; + +/** + * Fixes missing values (denoted using Double.NaN) using a fixed set of specified values. + * The 2nd parameter of impute is ignored, as we infer the number of imputed values + * from the number of Double.NaN. + */ +public class FixedValueImputer extends Imputer { + private double[] fixedValue; + + public FixedValueImputer(double[] fixedValue) { + this.fixedValue = fixedValue; + } + + /** + * Given an array of samples, fill each missing value (Double.NaN) with the fixed + * value configured for the corresponding feature; the 2nd parameter is ignored.
+ * + * @return an imputed array of size numImputed + */ + @Override + public double[][] impute(double[][] samples, int numImputed) { + int numFeatures = samples.length; + double[][] imputed = new double[numFeatures][numImputed]; + + for (int featureIndex = 0; featureIndex < numFeatures; featureIndex++) { + imputed[featureIndex] = singleFeatureInterpolate(samples[featureIndex], numImputed, fixedValue[featureIndex]); + } + return imputed; + } + + private double[] singleFeatureInterpolate(double[] samples, int numInterpolants, double defaultVal) { + return Arrays.stream(samples).map(d -> Double.isNaN(d) ? defaultVal : d).toArray(); + } + + @Override + protected double[] singleFeatureImpute(double[] samples, int numInterpolants) { + throw new UnsupportedOperationException("The operation is not supported"); + } +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationMethod.java b/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationMethod.java new file mode 100644 index 000000000..90494862c --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationMethod.java @@ -0,0 +1,25 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +public enum ImputationMethod { + /** + * This method replaces all missing values with 0's. It's a simple approach, but it may introduce bias if the data is not centered around zero. + */ + ZERO, + /** + * This method replaces missing values with a predefined set of values. The values are the same for each input dimension, and they need to be specified by the user. + */ + FIXED_VALUES, + /** + * This method replaces missing values with the last known value in the respective input dimension. It's a commonly used method for time series data, where temporal continuity is expected. + */ + PREVIOUS, + /** + * This method estimates missing values by interpolating linearly between known values in the respective input dimension. This method assumes that the data follows a linear trend. 
+ */ + LINEAR +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationOption.java b/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationOption.java new file mode 100644 index 000000000..9098aac14 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/ImputationOption.java @@ -0,0 +1,147 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; + +import java.io.IOException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.Locale; +import java.util.Objects; +import java.util.Optional; + +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.common.io.stream.Writeable; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; + +public class ImputationOption implements Writeable, ToXContent { + // field name in toXContent + public static final String METHOD_FIELD = "method"; + public static final String DEFAULT_FILL_FIELD = "defaultFill"; + public static final String INTEGER_SENSITIVE_FIELD = "integerSensitive"; + + private final ImputationMethod method; + private final Optional<double[]> defaultFill; + private final boolean integerSentive; + + public ImputationOption(ImputationMethod method, Optional<double[]> defaultFill, boolean integerSentive) { + this.method = method; + this.defaultFill = defaultFill; + this.integerSentive = integerSentive; + } + + public ImputationOption(ImputationMethod method) { + this(method, Optional.empty(), false); + } + + public ImputationOption(StreamInput in) throws IOException { + this.method = in.readEnum(ImputationMethod.class); + if (in.readBoolean()) { + this.defaultFill = Optional.of(in.readDoubleArray()); + } else { + this.defaultFill = Optional.empty(); + } + this.integerSentive = in.readBoolean(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeEnum(method); + if (defaultFill.isEmpty()) { + out.writeBoolean(false); + } else { + out.writeBoolean(true); + out.writeDoubleArray(defaultFill.get()); + } + out.writeBoolean(integerSentive); + } + + public static ImputationOption parse(XContentParser parser) throws IOException { + ImputationMethod method = ImputationMethod.ZERO; + List<Double> defaultFill = null; + Boolean integerSensitive = null; + + ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser); + while (parser.nextToken() != XContentParser.Token.END_OBJECT) { + String fieldName = parser.currentName(); + parser.nextToken(); + switch (fieldName) { + case METHOD_FIELD: + method = ImputationMethod.valueOf(parser.text().toUpperCase(Locale.ROOT)); + break; + case DEFAULT_FILL_FIELD: + ensureExpectedToken(XContentParser.Token.START_ARRAY, parser.currentToken(), parser); + defaultFill = new ArrayList<>(); + while (parser.nextToken() != XContentParser.Token.END_ARRAY) { + defaultFill.add(parser.doubleValue()); + } + break; + case INTEGER_SENSITIVE_FIELD: + integerSensitive = parser.booleanValue(); + break; + default: + break; + } + } + return new ImputationOption( + method, + Optional.ofNullable(defaultFill).map(list -> list.stream().mapToDouble(Double::doubleValue).toArray()), + integerSensitive + ); + } + + public XContentBuilder toXContent(XContentBuilder builder) throws IOException { + return toXContent(builder, ToXContent.EMPTY_PARAMS); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + XContentBuilder xContentBuilder = builder.startObject(); + + builder.field(METHOD_FIELD, method); + + if (!defaultFill.isEmpty()) { + builder.array(DEFAULT_FILL_FIELD, defaultFill.get()); + } + builder.field(INTEGER_SENSITIVE_FIELD, integerSentive); + return xContentBuilder.endObject(); + } + + @Override + public boolean equals(Object o) { + if (o == this) + return true; + if (o == null || getClass() != o.getClass()) + return false; + + ImputationOption other = (ImputationOption) o; + return method == other.method + && (defaultFill.isEmpty() ? other.defaultFill.isEmpty() : Arrays.equals(defaultFill.get(), other.defaultFill.get())) + && integerSentive == other.integerSentive; + } + + @Override + public int hashCode() { + return Objects.hash(method, (defaultFill.isEmpty() ? 0 : Arrays.hashCode(defaultFill.get())), integerSentive); + } + + public ImputationMethod getMethod() { + return method; + } + + public Optional<double[]> getDefaultFill() { + return defaultFill; + } + + public boolean isIntegerSentive() { + return integerSentive; + } +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/Imputer.java b/src/main/java/org/opensearch/timeseries/dataprocessor/Imputer.java new file mode 100644 index 000000000..4e885421c --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/Imputer.java @@ -0,0 +1,48 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +/* + * An object for imputing feature vectors. + * + * In certain situations, due to time and compute cost, we are only allowed to + * query a sparse sample of data points / feature vectors from a cluster. + * However, we need a large sample of feature vectors in order to train our + * anomaly detection algorithms. An Imputer approximates the data points + * between a given, ordered list of samples. + */ +public abstract class Imputer { + + /** + * Imputes the given sample feature vectors. + * + * Computes a list of `numImputed` feature vectors using the ordered list + * of `numSamples` input sample vectors where each sample vector has size + * `numFeatures`. + * + * + * @param samples A `numFeatures x numSamples` list of feature vectors. + * @param numImputed The desired number of imputed vectors. + * @return A `numFeatures x numImputed` list of feature vectors.
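+ * For example, with a ZeroImputer, impute(new double[][] { { 1.0, Double.NaN, 5.0 } }, 3) returns { { 1.0, 0.0, 5.0 } }.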
+ */ + public double[][] impute(double[][] samples, int numImputed) { + int numFeatures = samples.length; + double[][] interpolants = new double[numFeatures][numImputed]; + + for (int featureIndex = 0; featureIndex < numFeatures; featureIndex++) { + interpolants[featureIndex] = singleFeatureImpute(samples[featureIndex], numImputed); + } + return interpolants; + } + + /** + * compute per-feature impute value + * @param samples input array + * @param numImputed number of elements in the return array + * @return input array with missing values imputed + */ + protected abstract double[] singleFeatureImpute(double[] samples, int numImputed); +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/LinearUniformImputer.java b/src/main/java/org/opensearch/timeseries/dataprocessor/LinearUniformImputer.java new file mode 100644 index 000000000..2fa3fd651 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/LinearUniformImputer.java @@ -0,0 +1,82 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import java.util.Arrays; + +import com.google.common.math.DoubleMath; + +/** + * A piecewise linear imputer with uniformly spaced points. + * + * The LinearUniformImputer constructs a piecewise linear imputation on + * the input list of sample feature vectors. That is, between every consecutive + * pair of points we construct a linear imputation. The linear imputation + * is computed on a per-feature basis. + * + */ +public class LinearUniformImputer extends Imputer { + // if true, impute integral/floating-point results: when all samples are integral, + // the results are integral. Else, the results are floating points. + private boolean integerSensitive; + + public LinearUniformImputer(boolean integerSensitive) { + this.integerSensitive = integerSensitive; + } + + /* + * Piecewise linearly impute the given sample of one-dimensional + * features. + * + * Computes a list of `numImputed` features using the ordered list of + * `numSamples` input one-dimensional samples. The imputed features are + * computed using a piecewise linear imputation. + * + * @param samples A `numSamples` sized list of sample features. + * @param numImputed The desired number of imputed features. + * @return A `numImputed` sized array of imputed features. + */ + @Override + public double[] singleFeatureImpute(double[] samples, int numImputed) { + int numSamples = samples.length; + double[] imputedValues = new double[numImputed]; + + if (numSamples == 0) { + imputedValues = new double[0]; + } else if (numSamples == 1) { + Arrays.fill(imputedValues, samples[0]); + } else { + /* assume the piecewise linear imputation between the samples is a + parameterized curve f(t) for t in [0, 1]. Each pair of samples + determines an interval [t_i, t_(i+1)]. For each imputed value we + determine which interval it lies inside and then scale the value of t + accordingly to compute the imputed value. + + for numerical stability reasons we omit processing the final + imputed value in this loop since this last imputed value is always equal to the last sample.
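+ as a worked example: with samples {0, 10, 4} and numImputed = 5, tGlobal takes 0, 0.25, 0.5, 0.75 for the first four imputed values; scaling by (numSamples - 1) = 2 and splitting off the integer interval index leaves intervals 0, 0, 1, 1 with offsets 0, 0.5, 0, 0.5, so the imputed values are 0, 5, 10, 7, and the final value is the last sample, 4.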
+ */ + for (int imputedIndex = 0; imputedIndex < (numImputed - 1); imputedIndex++) { + double tGlobal = (imputedIndex) / (numImputed - 1.0); + double tInterval = tGlobal * (numSamples - 1.0); + int intervalIndex = (int) Math.floor(tInterval); + tInterval -= intervalIndex; + + double leftSample = samples[intervalIndex]; + double rightSample = samples[intervalIndex + 1]; + double imputed = (1.0 - tInterval) * leftSample + tInterval * rightSample; + imputedValues[imputedIndex] = imputed; + } + + // the final imputed value is always the final sample + imputedValues[numImputed - 1] = samples[numSamples - 1]; + } + if (integerSensitive && Arrays.stream(samples).allMatch(DoubleMath::isMathematicalInteger)) { + imputedValues = Arrays.stream(imputedValues).map(Math::rint).toArray(); + } + return imputedValues; + } +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputer.java b/src/main/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputer.java new file mode 100644 index 000000000..e91c90814 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputer.java @@ -0,0 +1,42 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +/** + * Given an array of samples, fill missing values (represented using Double.NaN) + * with the previous known value. + * The returned array has the same size as the input. Leading missing values stay + * Double.NaN, as there is no earlier known value to fill them with. + * The 2nd parameter of singleFeatureImpute is ignored, as we infer the number of + * imputed values from the number of Double.NaN. + * + */ +public class PreviousValueImputer extends Imputer { + + @Override + protected double[] singleFeatureImpute(double[] samples, int numInterpolants) { + int numSamples = samples.length; + double[] interpolants = new double[numSamples]; + + if (numSamples > 0) { + System.arraycopy(samples, 0, interpolants, 0, samples.length); + if (numSamples > 1) { + double lastKnownValue = Double.NaN; + for (int interpolantIndex = 0; interpolantIndex < numSamples; interpolantIndex++) { + if (Double.isNaN(interpolants[interpolantIndex])) { + if (!Double.isNaN(lastKnownValue)) { + interpolants[interpolantIndex] = lastKnownValue; + } + } else { + lastKnownValue = interpolants[interpolantIndex]; + } + } + } + } + return interpolants; + } +} diff --git a/src/main/java/org/opensearch/timeseries/dataprocessor/ZeroImputer.java b/src/main/java/org/opensearch/timeseries/dataprocessor/ZeroImputer.java new file mode 100644 index 000000000..1d0656de1 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/dataprocessor/ZeroImputer.java @@ -0,0 +1,21 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import java.util.Arrays; + +/** + * Fixes missing values (denoted using Double.NaN) using 0. + * The 2nd parameter of impute is ignored as we infer the number + * of imputed values using the number of Double.NaN. + */ +public class ZeroImputer extends Imputer { + + @Override + public double[] singleFeatureImpute(double[] samples, int numInterpolants) { + return Arrays.stream(samples).map(d -> Double.isNaN(d) ?
0.0 : d).toArray(); + } +} diff --git a/src/main/java/org/opensearch/ad/feature/SearchFeatureDao.java b/src/main/java/org/opensearch/timeseries/feature/SearchFeatureDao.java similarity index 90% rename from src/main/java/org/opensearch/ad/feature/SearchFeatureDao.java rename to src/main/java/org/opensearch/timeseries/feature/SearchFeatureDao.java index 6218fa748..1ce44472f 100644 --- a/src/main/java/org/opensearch/ad/feature/SearchFeatureDao.java +++ b/src/main/java/org/opensearch/timeseries/feature/SearchFeatureDao.java @@ -9,14 +9,13 @@ * GitHub history for details. */ -package org.opensearch.ad.feature; +package org.opensearch.timeseries.feature; import static org.apache.commons.math3.linear.MatrixUtils.createRealMatrix; -import static org.opensearch.ad.constant.CommonName.DATE_HISTOGRAM; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_PAGE_SIZE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.PAGE_SIZE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.PREVIEW_TIMEOUT_IN_MILLIS; -import static org.opensearch.ad.util.ParseUtils.batchFeatureQuery; +import static org.opensearch.timeseries.util.ParseUtils.batchFeatureQuery; import java.io.IOException; import java.time.Clock; @@ -39,14 +38,8 @@ import org.apache.logging.log4j.Logger; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.dataprocessor.Interpolator; +import org.opensearch.ad.feature.AbstractRetriever; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.util.ParseUtils; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; @@ -72,6 +65,15 @@ import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.sort.FieldSortBuilder; import org.opensearch.search.sort.SortOrder; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.util.ParseUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; /** * DAO for features from search. 
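Taken together, the new dataprocessor files above split one Imputer interface across four strategies. The following is a minimal, self-contained sketch of how they behave; the ImputerSketch harness and the printed results are illustrative only and not part of this patch, assuming the classes compile as defined above:

import java.util.Arrays;
import java.util.Optional;

import org.opensearch.timeseries.dataprocessor.FixedValueImputer;
import org.opensearch.timeseries.dataprocessor.ImputationMethod;
import org.opensearch.timeseries.dataprocessor.ImputationOption;
import org.opensearch.timeseries.dataprocessor.LinearUniformImputer;
import org.opensearch.timeseries.dataprocessor.PreviousValueImputer;
import org.opensearch.timeseries.dataprocessor.ZeroImputer;

public class ImputerSketch {
    public static void main(String[] args) {
        // one feature, five samples, two of them missing
        double[][] samples = { { 1.0, Double.NaN, 3.0, Double.NaN, 5.0 } };

        // ZERO: missing values become 0; the second argument is ignored by this imputer
        System.out.println(Arrays.deepToString(new ZeroImputer().impute(samples, 5)));
        // [[1.0, 0.0, 3.0, 0.0, 5.0]]

        // PREVIOUS: each missing value takes the last known value in its feature
        System.out.println(Arrays.deepToString(new PreviousValueImputer().impute(samples, 5)));
        // [[1.0, 1.0, 3.0, 3.0, 5.0]]

        // FIXED_VALUES: ImputationOption carries one fill value per feature
        ImputationOption option = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(new double[] { -1.0 }), false);
        System.out.println(Arrays.deepToString(new FixedValueImputer(option.getDefaultFill().get()).impute(samples, 5)));
        // [[1.0, -1.0, 3.0, -1.0, 5.0]]

        // LINEAR resamples uniformly between known points rather than filling NaNs:
        // two endpoints are stretched to five values, rounded because
        // integerSensitive is true and both inputs are integral
        System.out.println(Arrays.deepToString(new LinearUniformImputer(true).impute(new double[][] { { 1.0, 5.0 } }, 5)));
        // [[1.0, 2.0, 3.0, 4.0, 5.0]]
    }
}

LinearUniformImputer is the odd one out: it resamples between known points (SearchFeatureDao below uses it to interpolate between strided samples), while the other three fill Double.NaN slots in place.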
@@ -86,7 +88,7 @@ public class SearchFeatureDao extends AbstractRetriever { // Dependencies private final Client client; private final NamedXContentRegistry xContent; - private final Interpolator interpolator; + private final Imputer imputer; private final SecurityClientUtil clientUtil; private volatile int maxEntitiesForPreview; private volatile int pageSize; @@ -98,7 +100,7 @@ public class SearchFeatureDao extends AbstractRetriever { public SearchFeatureDao( Client client, NamedXContentRegistry xContent, - Interpolator interpolator, + Imputer imputer, SecurityClientUtil clientUtil, Settings settings, ClusterService clusterService, @@ -110,7 +112,7 @@ public SearchFeatureDao( ) { this.client = client; this.xContent = xContent; - this.interpolator = interpolator; + this.imputer = imputer; this.clientUtil = clientUtil; this.maxEntitiesForPreview = maxEntitiesForPreview; @@ -118,7 +120,7 @@ public SearchFeatureDao( if (clusterService != null) { clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_ENTITIES_FOR_PREVIEW, it -> this.maxEntitiesForPreview = it); - clusterService.getClusterSettings().addSettingsUpdateConsumer(PAGE_SIZE, it -> this.pageSize = it); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_PAGE_SIZE, it -> this.pageSize = it); } this.minimumDocCountForPreview = minimumDocCount; this.previewTimeoutInMilliseconds = previewTimeoutInMilliseconds; @@ -130,7 +132,7 @@ public SearchFeatureDao( * * @param client ES client for queries * @param xContent ES XContentRegistry - * @param interpolator interpolator for missing values + * @param imputer imputer for missing values * @param clientUtil utility for ES client * @param settings ES settings * @param clusterService ES ClusterService @@ -140,7 +142,7 @@ public SearchFeatureDao( public SearchFeatureDao( Client client, NamedXContentRegistry xContent, - Interpolator interpolator, + Imputer imputer, SecurityClientUtil clientUtil, Settings settings, ClusterService clusterService, @@ -149,14 +151,14 @@ public SearchFeatureDao( this( client, xContent, - interpolator, + imputer, clientUtil, settings, clusterService, minimumDocCount, Clock.systemUTC(), MAX_ENTITIES_FOR_PREVIEW.get(settings), - PAGE_SIZE.get(settings), + AD_PAGE_SIZE.get(settings), PREVIEW_TIMEOUT_IN_MILLIS ); } @@ -180,8 +182,9 @@ public void getLatestDataTime(AnomalyDetector detector, ActionListenerasyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } @@ -216,7 +219,7 @@ public void getHighestCountEntities( int pageSize, ActionListener> listener ) { - if (!detector.isMultientityDetector()) { + if (!detector.isHighCardinality()) { listener.onResponse(null); return; } @@ -231,8 +234,8 @@ public void getHighestCountEntities( BoolQueryBuilder boolQueryBuilder = QueryBuilders.boolQuery().filter(rangeQuery).filter(detector.getFilterQuery()); AggregationBuilder bucketAggs = null; - if (detector.getCategoryField().size() == 1) { - bucketAggs = AggregationBuilders.terms(AGG_NAME_TOP).size(maxEntitiesSize).field(detector.getCategoryField().get(0)); + if (detector.getCategoryFields().size() == 1) { + bucketAggs = AggregationBuilders.terms(AGG_NAME_TOP).size(maxEntitiesSize).field(detector.getCategoryFields().get(0)); } else { /* * We don't have an efficient solution for terms aggregation on multiple fields. 
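The single-category-field branch above boils down to a small query: filter the detection window, then let a terms aggregation, which orders buckets by descending document count, surface the busiest entities. A rough, self-contained sketch follows; the field names, the aggregation name, and the window bounds are illustrative and not taken from the patch:

import org.opensearch.index.query.BoolQueryBuilder;
import org.opensearch.index.query.QueryBuilders;
import org.opensearch.search.aggregations.AggregationBuilders;
import org.opensearch.search.builder.SearchSourceBuilder;

public class TopEntitiesSketch {
    // builds a "busiest entities" query for a single category field
    public static SearchSourceBuilder topEntities(String timeField, long startMillis, long endMillis, String categoryField, int maxEntities) {
        BoolQueryBuilder filter = QueryBuilders.boolQuery()
            .filter(QueryBuilders.rangeQuery(timeField).gte(startMillis).lt(endMillis).format("epoch_millis"));
        return new SearchSourceBuilder()
            .query(filter)
            // terms buckets default to descending doc count, so the first
            // maxEntities buckets are the highest-count entities
            .aggregation(AggregationBuilders.terms("top_entities").size(maxEntities).field(categoryField))
            .trackTotalHits(false)
            .size(0);
    }
}

The multi-field case in the following hunk swaps the terms aggregation for a composite aggregation over TermsValuesSourceBuilders, paged through pageSize buckets at a time.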
@@ -328,7 +331,7 @@ public void getHighestCountEntities( bucketAggs = AggregationBuilders .composite( AGG_NAME_TOP, - detector.getCategoryField().stream().map(f -> new TermsValuesSourceBuilder(f).field(f)).collect(Collectors.toList()) + detector.getCategoryFields().stream().map(f -> new TermsValuesSourceBuilder(f).field(f)).collect(Collectors.toList()) ) .size(pageSize) .subAggregation( @@ -359,8 +362,9 @@ public void getHighestCountEntities( .asyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } @@ -410,14 +414,14 @@ public void onResponse(SearchResponse response) { return; } - if (detector.getCategoryField().size() == 1) { + if (detector.getCategoryFields().size() == 1) { topEntities = ((Terms) aggrResult) .getBuckets() .stream() .map(bucket -> bucket.getKeyAsString()) .collect(Collectors.toList()) .stream() - .map(entityValue -> Entity.createSingleAttributeEntity(detector.getCategoryField().get(0), entityValue)) + .map(entityValue -> Entity.createSingleAttributeEntity(detector.getCategoryFields().get(0), entityValue)) .collect(Collectors.toList()); listener.onResponse(topEntities); } else { @@ -438,7 +442,7 @@ public void onResponse(SearchResponse response) { listener.onResponse(topEntities); } else if (expirationEpochMs < clock.millis()) { if (topEntities.isEmpty()) { - listener.onFailure(new AnomalyDetectionException("timeout to get preview results. Please retry later.")); + listener.onFailure(new TimeSeriesException("timeout to get preview results. Please retry later.")); } else { logger.info("timeout to get preview results. Send whatever we have."); listener.onResponse(topEntities); @@ -451,8 +455,9 @@ public void onResponse(SearchResponse response) { .asyncRequestWithInjectedSecurity( new SearchRequest().indices(detector.getIndices().toArray(new String[0])).source(searchSourceBuilder), client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, this ); } @@ -471,23 +476,25 @@ public void onFailure(Exception e) { /** * Get the entity's earliest timestamps - * @param detector detector config + * @param config analysis config * @param entity the entity's information * @param listener listener to return back the requested timestamps */ - public void getEntityMinDataTime(AnomalyDetector detector, Entity entity, ActionListener> listener) { + public void getMinDataTime(Config config, Optional entity, AnalysisType context, ActionListener> listener) { BoolQueryBuilder internalFilterQuery = QueryBuilders.boolQuery(); - for (TermQueryBuilder term : entity.getTermQueryBuilders()) { - internalFilterQuery.filter(term); + if (entity.isPresent()) { + for (TermQueryBuilder term : entity.get().getTermQueryBuilders()) { + internalFilterQuery.filter(term); + } } SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder() .query(internalFilterQuery) - .aggregation(AggregationBuilders.min(AGG_NAME_MIN).field(detector.getTimeField())) + .aggregation(AggregationBuilders.min(AGG_NAME_MIN).field(config.getTimeField())) .trackTotalHits(false) .size(0); - SearchRequest searchRequest = new SearchRequest().indices(detector.getIndices().toArray(new String[0])).source(searchSourceBuilder); + SearchRequest searchRequest = new SearchRequest().indices(config.getIndices().toArray(new String[0])).source(searchSourceBuilder); final ActionListener searchResponseListener = ActionListener.wrap(response -> { listener.onResponse(parseMinDataTime(response)); }, 
listener::onFailure); @@ -496,8 +503,9 @@ public void getEntityMinDataTime(AnomalyDetector detector, Entity entity, Action .asyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + config.getId(), client, + context, searchResponseListener ); } @@ -529,8 +537,9 @@ public void getFeaturesForPeriod(AnomalyDetector detector, long startTime, long .asyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } @@ -543,7 +552,7 @@ public void getFeaturesForPeriodByBatch( ActionListener>> listener ) throws IOException { SearchSourceBuilder searchSourceBuilder = batchFeatureQuery(detector, entity, startTime, endTime, xContent); - logger.debug("Batch query for detector {}: {} ", detector.getDetectorId(), searchSourceBuilder); + logger.debug("Batch query for detector {}: {} ", detector.getId(), searchSourceBuilder); SearchRequest searchRequest = new SearchRequest(detector.getIndices().toArray(new String[0])).source(searchSourceBuilder); final ActionListener searchResponseListener = ActionListener.wrap(response -> { @@ -554,8 +563,9 @@ public void getFeaturesForPeriodByBatch( .asyncRequestWithInjectedSecurity( searchRequest, client::search, - detector.getDetectorId(), + detector.getId(), client, + AnalysisType.AD, searchResponseListener ); } @@ -568,7 +578,7 @@ private Map> parseBucketAggregationResponse(SearchRespo List buckets = ((InternalComposite) agg).getBuckets(); buckets.forEach(bucket -> { Optional featureData = parseAggregations(Optional.ofNullable(bucket.getAggregations()), featureIds); - dataPoints.put((Long) bucket.getKey().get(DATE_HISTOGRAM), featureData); + dataPoints.put((Long) bucket.getKey().get(CommonName.DATE_HISTOGRAM), featureData); }); } return dataPoints; @@ -583,24 +593,24 @@ public Optional parseResponse(SearchResponse response, List fe * * Sampled features are not true features. They are intended to be approximate results produced at low costs. 
* - * @param detector info about the indices, documents, feature query + * @param config info about the indices, documents, feature query * @param ranges list of time ranges * @param listener handle approximate features for the time ranges * @throws IOException if a user gives wrong query input when defining a detector */ public void getFeatureSamplesForPeriods( - AnomalyDetector detector, + Config config, List> ranges, + AnalysisType context, ActionListener>> listener ) throws IOException { - SearchRequest request = createPreviewSearchRequest(detector, ranges); + SearchRequest request = createPreviewSearchRequest(config, ranges); final ActionListener searchResponseListener = ActionListener.wrap(response -> { Aggregations aggs = response.getAggregations(); if (aggs == null) { listener.onResponse(Collections.emptyList()); return; } - listener .onResponse( aggs @@ -608,7 +618,7 @@ public void getFeatureSamplesForPeriods( .stream() .filter(InternalDateRange.class::isInstance) .flatMap(agg -> ((InternalDateRange) agg).getBuckets().stream()) - .map(bucket -> parseBucket(bucket, detector.getEnabledFeatureIds())) + .map(bucket -> parseBucket(bucket, config.getEnabledFeatureIds())) .collect(Collectors.toList()) ); }, listener::onFailure); @@ -617,8 +627,9 @@ public void getFeatureSamplesForPeriods( .asyncRequestWithInjectedSecurity( request, client::search, - detector.getDetectorId(), + config.getId(), client, + context, searchResponseListener ); } @@ -644,7 +655,7 @@ public void getFeaturesForSampledPeriods( ActionListener>> listener ) { Map cache = new HashMap<>(); - logger.info(String.format(Locale.ROOT, "Getting features for detector %s ending at %d", detector.getDetectorId(), endTime)); + logger.info(String.format(Locale.ROOT, "Getting features for detector %s ending at %d", detector.getId(), endTime)); getFeatureSamplesWithCache(detector, maxSamples, maxStride, endTime, cache, maxStride, listener); } @@ -698,7 +709,7 @@ private void processFeatureSamplesForStride( .format( Locale.ROOT, "Get features for detector %s finishes without any features present, current stride %d", - detector.getDetectorId(), + detector.getId(), currentStride ) ); @@ -710,7 +721,7 @@ private void processFeatureSamplesForStride( .format( Locale.ROOT, "Get features for detector %s finishes with %d samples, current stride %d", - detector.getDetectorId(), + detector.getId(), features.get().length, currentStride ) @@ -732,7 +743,7 @@ private void getFeatureSamplesForStride( ) { ArrayDeque sampledFeatures = new ArrayDeque<>(maxSamples); boolean isInterpolatable = currentStride < maxStride; - long span = ((IntervalTimeConfiguration) detector.getDetectionInterval()).toDuration().toMillis(); + long span = ((IntervalTimeConfiguration) detector.getInterval()).toDuration().toMillis(); sampleForIteration(detector, cache, maxSamples, endTime, span, currentStride, sampledFeatures, isInterpolatable, 0, listener); } @@ -824,7 +835,7 @@ private Optional toMatrix(ArrayDeque sampledFeatures) { } private double[] getInterpolants(double[] previous, double[] next) { - return transpose(interpolator.interpolate(transpose(new double[][] { previous, next }), 3))[1]; + return transpose(imputer.impute(transpose(new double[][] { previous, next }), 3))[1]; } private double[][] transpose(double[][] matrix) { @@ -837,33 +848,30 @@ private SearchRequest createFeatureSearchRequest(AnomalyDetector detector, long SearchSourceBuilder searchSourceBuilder = ParseUtils.generateInternalFeatureQuery(detector, startTime, endTime, xContent); return new 
SearchRequest(detector.getIndices().toArray(new String[0]), searchSourceBuilder).preference(preference.orElse(null)); } catch (IOException e) { - logger - .warn( - "Failed to create feature search request for " + detector.getDetectorId() + " from " + startTime + " to " + endTime, - e - ); + logger.warn("Failed to create feature search request for " + detector.getId() + " from " + startTime + " to " + endTime, e); throw new IllegalStateException(e); } } - private SearchRequest createPreviewSearchRequest(AnomalyDetector detector, List> ranges) throws IOException { + private SearchRequest createPreviewSearchRequest(Config config, List> ranges) throws IOException { try { - SearchSourceBuilder searchSourceBuilder = ParseUtils.generatePreviewQuery(detector, ranges, xContent); - return new SearchRequest(detector.getIndices().toArray(new String[0]), searchSourceBuilder); + SearchSourceBuilder searchSourceBuilder = ParseUtils.generatePreviewQuery(config, ranges, xContent); + return new SearchRequest(config.getIndices().toArray(new String[0]), searchSourceBuilder); } catch (IOException e) { - logger.warn("Failed to create feature search request for " + detector.getDetectorId() + " for preview", e); + logger.warn("Failed to create feature search request for " + config.getId() + " for preview", e); throw e; } } public void getColdStartSamplesForPeriods( - AnomalyDetector detector, + Config config, List> ranges, - Entity entity, + Optional entity, boolean includesEmptyBucket, + AnalysisType context, ActionListener>> listener ) { - SearchRequest request = createColdStartFeatureSearchRequest(detector, ranges, entity); + SearchRequest request = createColdStartFeatureSearchRequest(config, ranges, entity); final ActionListener searchResponseListener = ActionListener.wrap(response -> { Aggregations aggs = response.getAggregations(); if (aggs == null) { @@ -893,7 +901,7 @@ public void getColdStartSamplesForPeriods( .filter(bucket -> bucket.getFrom() != null && bucket.getFrom() instanceof ZonedDateTime) .filter(bucket -> bucket.getDocCount() > docCountThreshold) .sorted(Comparator.comparing((Bucket bucket) -> (ZonedDateTime) bucket.getFrom())) - .map(bucket -> parseBucket(bucket, detector.getEnabledFeatureIds())) + .map(bucket -> parseBucket(bucket, config.getEnabledFeatureIds())) .collect(Collectors.toList()) ); }, listener::onFailure); @@ -903,21 +911,22 @@ public void getColdStartSamplesForPeriods( .asyncRequestWithInjectedSecurity( request, client::search, - detector.getDetectorId(), + config.getId(), client, + context, searchResponseListener ); } - private SearchRequest createColdStartFeatureSearchRequest(AnomalyDetector detector, List> ranges, Entity entity) { + private SearchRequest createColdStartFeatureSearchRequest(Config detector, List> ranges, Optional entity) { try { - SearchSourceBuilder searchSourceBuilder = ParseUtils.generateEntityColdStartQuery(detector, ranges, entity, xContent); + SearchSourceBuilder searchSourceBuilder = ParseUtils.generateColdStartQuery(detector, ranges, entity, xContent); return new SearchRequest(detector.getIndices().toArray(new String[0]), searchSourceBuilder); } catch (IOException e) { logger .warn( "Failed to create cold start feature search request for " - + detector.getDetectorId() + + detector.getId() + " from " + ranges.get(0).getKey() + " to " diff --git a/src/main/java/org/opensearch/timeseries/function/BiCheckedFunction.java b/src/main/java/org/opensearch/timeseries/function/BiCheckedFunction.java new file mode 100644 index 000000000..d96b14adf --- /dev/null 
+++ b/src/main/java/org/opensearch/timeseries/function/BiCheckedFunction.java @@ -0,0 +1,11 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.function; + +@FunctionalInterface +public interface BiCheckedFunction { + R apply(T t, F f) throws E; +} diff --git a/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorFunction.java b/src/main/java/org/opensearch/timeseries/function/ExecutorFunction.java similarity index 85% rename from src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorFunction.java rename to src/main/java/org/opensearch/timeseries/function/ExecutorFunction.java index 929120561..90cd93cfb 100644 --- a/src/main/java/org/opensearch/ad/rest/handler/AnomalyDetectorFunction.java +++ b/src/main/java/org/opensearch/timeseries/function/ExecutorFunction.java @@ -9,10 +9,10 @@ * GitHub history for details. */ -package org.opensearch.ad.rest.handler; +package org.opensearch.timeseries.function; @FunctionalInterface -public interface AnomalyDetectorFunction { +public interface ExecutorFunction { /** * Performs this operation. diff --git a/src/main/java/org/opensearch/ad/util/ThrowingConsumer.java b/src/main/java/org/opensearch/timeseries/function/ThrowingConsumer.java similarity index 92% rename from src/main/java/org/opensearch/ad/util/ThrowingConsumer.java rename to src/main/java/org/opensearch/timeseries/function/ThrowingConsumer.java index d3f981552..8a7210f01 100644 --- a/src/main/java/org/opensearch/ad/util/ThrowingConsumer.java +++ b/src/main/java/org/opensearch/timeseries/function/ThrowingConsumer.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.util; +package org.opensearch.timeseries.function; /** * A consumer that can throw checked exception diff --git a/src/main/java/org/opensearch/ad/util/ThrowingSupplier.java b/src/main/java/org/opensearch/timeseries/function/ThrowingSupplier.java similarity index 92% rename from src/main/java/org/opensearch/ad/util/ThrowingSupplier.java rename to src/main/java/org/opensearch/timeseries/function/ThrowingSupplier.java index 9810ffcaf..a56f513a8 100644 --- a/src/main/java/org/opensearch/ad/util/ThrowingSupplier.java +++ b/src/main/java/org/opensearch/timeseries/function/ThrowingSupplier.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.util; +package org.opensearch.timeseries.function; /** * A supplier that can throw checked exception diff --git a/src/main/java/org/opensearch/ad/util/ThrowingSupplierWrapper.java b/src/main/java/org/opensearch/timeseries/function/ThrowingSupplierWrapper.java similarity index 96% rename from src/main/java/org/opensearch/ad/util/ThrowingSupplierWrapper.java rename to src/main/java/org/opensearch/timeseries/function/ThrowingSupplierWrapper.java index 42ceb1526..c57b11d33 100644 --- a/src/main/java/org/opensearch/ad/util/ThrowingSupplierWrapper.java +++ b/src/main/java/org/opensearch/timeseries/function/ThrowingSupplierWrapper.java @@ -9,7 +9,7 @@ * GitHub history for details. 
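BiCheckedFunction's type parameters are stripped in this rendering; the sketch below assumes the natural shape <T, F, R, E extends Exception> and shows the intended use, a two-argument function that may throw a checked exception (for example a parser handle plus a document id):

    import java.io.IOException;

    public class BiCheckedFunctionDemo {
        // assumed generic shape; the declaration above lost its <...> parameters
        @FunctionalInterface
        interface BiCheckedFunction<T, F, R, E extends Exception> {
            R apply(T t, F f) throws E;
        }

        public static void main(String[] args) throws IOException {
            BiCheckedFunction<String, String, Integer, IOException> parse = (payload, name) -> {
                if (payload.isEmpty()) {
                    throw new IOException("empty payload for " + name);
                }
                return Integer.parseInt(payload);
            };
            System.out.println(parse.apply("42", "doc-count")); // 42
        }
    }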
*/ -package org.opensearch.ad.util; +package org.opensearch.timeseries.function; import java.util.function.Supplier; diff --git a/src/main/java/org/opensearch/ad/indices/AnomalyDetectionIndices.java b/src/main/java/org/opensearch/timeseries/indices/IndexManagement.java similarity index 58% rename from src/main/java/org/opensearch/ad/indices/AnomalyDetectionIndices.java rename to src/main/java/org/opensearch/timeseries/indices/IndexManagement.java index 64b242834..36134c263 100644 --- a/src/main/java/org/opensearch/ad/indices/AnomalyDetectionIndices.java +++ b/src/main/java/org/opensearch/timeseries/indices/IndexManagement.java @@ -9,19 +9,9 @@ * GitHub history for details. */ -package org.opensearch.ad.indices; - -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_RESULT_INDEX; -import static org.opensearch.ad.constant.CommonName.DUMMY_AD_RESULT_ID; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_DETECTION_STATE_INDEX_MAPPING_FILE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_DETECTORS_INDEX_MAPPING_FILE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_DETECTOR_JOBS_INDEX_MAPPING_FILE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.ANOMALY_RESULTS_INDEX_MAPPING_FILE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_INDEX_MAPPING_FILE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_PRIMARY_SHARDS; +package org.opensearch.timeseries.indices; + +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_RESULT_INDEX; import java.io.IOException; import java.net.URL; @@ -57,15 +47,7 @@ import org.opensearch.action.index.IndexRequest; import org.opensearch.action.support.GroupedActionListener; import org.opensearch.action.support.IndicesOptions; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.constant.CommonValue; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; -import org.opensearch.ad.util.DiscoveryNodeFilterer; +import org.opensearch.ad.indices.ADIndex; import org.opensearch.client.AdminClient; import org.opensearch.client.Client; import org.opensearch.cluster.LocalNodeClusterManagerListener; @@ -82,315 +64,705 @@ import org.opensearch.core.common.Strings; import org.opensearch.core.common.bytes.BytesArray; import org.opensearch.core.xcontent.NamedXContentRegistry; -import org.opensearch.core.xcontent.ToXContent; -import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.core.xcontent.XContentParser.Token; import org.opensearch.index.IndexNotFoundException; import org.opensearch.threadpool.Scheduler; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import 
org.opensearch.timeseries.constant.CommonValue; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import com.google.common.base.Charsets; import com.google.common.io.Resources; -/** - * This class provides utility methods for various anomaly detection indices. - */ -public class AnomalyDetectionIndices implements LocalNodeClusterManagerListener { - private static final Logger logger = LogManager.getLogger(AnomalyDetectionIndices.class); - - // The index name pattern to query all the AD result history indices - public static final String AD_RESULT_HISTORY_INDEX_PATTERN = "<.opendistro-anomaly-results-history-{now/d}-1>"; - - // The index name pattern to query all AD result, history and current AD result - public static final String ALL_AD_RESULTS_INDEX_PATTERN = ".opendistro-anomaly-results*"; +public abstract class IndexManagement & TimeSeriesIndex> implements LocalNodeClusterManagerListener { + private static final Logger logger = LogManager.getLogger(IndexManagement.class); // minimum shards of the job index public static int minJobIndexReplicas = 1; // maximum shards of the job index public static int maxJobIndexReplicas = 20; - // package private for testing - static final String META = "_meta"; - private static final String SCHEMA_VERSION = "schema_version"; - - private ClusterService clusterService; - private final Client client; - private final AdminClient adminClient; - private final ThreadPool threadPool; - - private volatile TimeValue historyRolloverPeriod; - private volatile Long historyMaxDocs; - private volatile TimeValue historyRetentionPeriod; - - private Scheduler.Cancellable scheduledRollover = null; + public static final String META = "_meta"; + public static final String SCHEMA_VERSION = "schema_version"; + + protected ClusterService clusterService; + protected final Client client; + protected final AdminClient adminClient; + protected final ThreadPool threadPool; + protected DiscoveryNodeFilterer nodeFilter; + // index settings + protected final Settings settings; + // don't retry updating endlessly. Can be annoying if there are too many exception logs. + protected final int maxUpdateRunningTimes; - private DiscoveryNodeFilterer nodeFilter; - private int maxPrimaryShards; - // keep track of whether the mapping version is up-to-date - private EnumMap indexStates; // whether all index have the correct mappings - private boolean allMappingUpdated; + protected boolean allMappingUpdated; // whether all index settings are updated - private boolean allSettingUpdated; + protected boolean allSettingUpdated; // we only want one update at a time - private final AtomicBoolean updateRunning; - // don't retry updating endlessly. Can be annoying if there are too many exception logs. 
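The new base class is generic over the plugin's index enum (the bound, stripped in this rendering, is presumably IndexType extends Enum<IndexType> & TimeSeriesIndex) and, as the constructor below shows, receives its rollover knobs and result mapping from the subclass instead of reading AD-specific settings statics. An illustrative reduction of that inversion, with made-up values:

    import java.time.Duration;

    public class IndexManagementShapeSketch {
        abstract static class Base<I extends Enum<I>> {
            final Duration historyRolloverPeriod;
            final long historyMaxDocs;
            final String resultMapping;

            Base(Duration rolloverPeriod, long maxDocs, String resultMapping) {
                this.historyRolloverPeriod = rolloverPeriod;
                this.historyMaxDocs = maxDocs;
                this.resultMapping = resultMapping;
            }
        }

        enum AdIndexStandIn { CONFIG, JOB, RESULT, STATE, CHECKPOINT }

        // the AD subclass supplies AD settings; a forecast subclass would supply its own
        static class AdIndices extends Base<AdIndexStandIn> {
            AdIndices() { super(Duration.ofHours(12), 1_350_000L, "{\"_meta\":{\"schema_version\":1}}"); }
        }

        public static void main(String[] args) {
            AdIndices indices = new AdIndices();
            System.out.println("rollover every " + indices.historyRolloverPeriod);
        }
    }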
- private final int maxUpdateRunningTimes; + protected final AtomicBoolean updateRunning; // the number of times updates run - private int updateRunningTimes; - // AD index settings - private final Settings settings; - + protected int updateRunningTimes; + private final Class indexType; + // keep track of whether the mapping version is up-to-date + protected EnumMap indexStates; + protected int maxPrimaryShards; + private Scheduler.Cancellable scheduledRollover = null; + protected volatile TimeValue historyRolloverPeriod; + protected volatile Long historyMaxDocs; + protected volatile TimeValue historyRetentionPeriod; // result index mapping to valida custom index - private Map AD_RESULT_FIELD_CONFIGS; + private Map RESULT_FIELD_CONFIGS; + private String resultMapping; - class IndexState { + protected class IndexState { // keep track of whether the mapping version is up-to-date - private Boolean mappingUpToDate; + public Boolean mappingUpToDate; // keep track of whether the setting needs to change - private Boolean settingUpToDate; + public Boolean settingUpToDate; // record schema version reading from the mapping file - private Integer schemaVersion; + public Integer schemaVersion; - IndexState(ADIndex index) { + public IndexState(String mappingFile) { this.mappingUpToDate = false; - settingUpToDate = false; - this.schemaVersion = parseSchemaVersion(index.getMapping()); + this.settingUpToDate = false; + this.schemaVersion = IndexManagement.parseSchemaVersion(mappingFile); } } - /** - * Constructor function - * - * @param client ES client supports administrative actions - * @param clusterService ES cluster service - * @param threadPool ES thread pool - * @param settings ES cluster setting - * @param nodeFilter Used to filter eligible nodes to host AD indices - * @param maxUpdateRunningTimes max number of retries to update index mapping and setting - */ - public AnomalyDetectionIndices( + protected IndexManagement( Client client, ClusterService clusterService, ThreadPool threadPool, Settings settings, DiscoveryNodeFilterer nodeFilter, - int maxUpdateRunningTimes - ) { + int maxUpdateRunningTimes, + Class indexType, + int maxPrimaryShards, + TimeValue historyRolloverPeriod, + Long historyMaxDocs, + TimeValue historyRetentionPeriod, + String resultMapping + ) + throws IOException { this.client = client; this.adminClient = client.admin(); this.clusterService = clusterService; this.threadPool = threadPool; this.clusterService.addLocalNodeClusterManagerListener(this); - this.historyRolloverPeriod = AD_RESULT_HISTORY_ROLLOVER_PERIOD.get(settings); - this.historyMaxDocs = AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD.get(settings); - this.historyRetentionPeriod = AD_RESULT_HISTORY_RETENTION_PERIOD.get(settings); - this.maxPrimaryShards = MAX_PRIMARY_SHARDS.get(settings); - this.nodeFilter = nodeFilter; - - this.indexStates = new EnumMap(ADIndex.class); + this.settings = Settings.builder().put("index.hidden", true).build(); + this.maxUpdateRunningTimes = maxUpdateRunningTimes; + this.indexType = indexType; + this.maxPrimaryShards = maxPrimaryShards; + this.historyRolloverPeriod = historyRolloverPeriod; + this.historyMaxDocs = historyMaxDocs; + this.historyRetentionPeriod = historyRetentionPeriod; this.allMappingUpdated = false; this.allSettingUpdated = false; this.updateRunning = new AtomicBoolean(false); + this.updateRunningTimes = 0; + this.resultMapping = resultMapping; + } - this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD, it -> historyMaxDocs = 
it); + /** + * Alias exists or not + * @param alias Alias name + * @return true if the alias exists + */ + public boolean doesAliasExist(String alias) { + return clusterService.state().metadata().hasAlias(alias); + } - this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_ROLLOVER_PERIOD, it -> { - historyRolloverPeriod = it; - rescheduleRollover(); - }); - this.clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_RESULT_HISTORY_RETENTION_PERIOD, it -> { - historyRetentionPeriod = it; - }); + public static Integer parseSchemaVersion(String mapping) { + try { + XContentParser xcp = XContentType.JSON + .xContent() + .createParser(NamedXContentRegistry.EMPTY, LoggingDeprecationHandler.INSTANCE, mapping); - this.clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_PRIMARY_SHARDS, it -> maxPrimaryShards = it); + while (!xcp.isClosed()) { + Token token = xcp.currentToken(); + if (token != null && token != XContentParser.Token.END_OBJECT && token != XContentParser.Token.START_OBJECT) { + if (xcp.currentName() != IndexManagement.META) { + xcp.nextToken(); + xcp.skipChildren(); + } else { + while (xcp.nextToken() != XContentParser.Token.END_OBJECT) { + if (xcp.currentName().equals(IndexManagement.SCHEMA_VERSION)) { - this.settings = Settings.builder().put("index.hidden", true).build(); + Integer version = xcp.intValue(); + if (version < 0) { + version = CommonValue.NO_SCHEMA_VERSION; + } + return version; + } else { + xcp.nextToken(); + } + } - this.maxUpdateRunningTimes = maxUpdateRunningTimes; - this.updateRunningTimes = 0; + } + } + xcp.nextToken(); + } + return CommonValue.NO_SCHEMA_VERSION; + } catch (Exception e) { + // since this method is called in the constructor that is called by TimeSeriesAnalyticsPlugin.createComponents, + // we cannot throw checked exception + throw new RuntimeException(e); + } + } - this.AD_RESULT_FIELD_CONFIGS = null; + protected static Integer getIntegerSetting(GetSettingsResponse settingsResponse, String settingKey) { + Integer value = null; + for (Settings settings : settingsResponse.getIndexToSettings().values()) { + value = settings.getAsInt(settingKey, null); + if (value != null) { + break; + } + } + return value; } - private void initResultMapping() throws IOException { - if (AD_RESULT_FIELD_CONFIGS != null) { - // we have already initiated the field + protected static String getStringSetting(GetSettingsResponse settingsResponse, String settingKey) { + String value = null; + for (Settings settings : settingsResponse.getIndexToSettings().values()) { + value = settings.get(settingKey, null); + if (value != null) { + break; + } + } + return value; + } + + public boolean doesIndexExist(String indexName) { + return clusterService.state().metadata().hasIndex(indexName); + } + + protected static String getMappings(String mappingFileRelativePath) throws IOException { + URL url = IndexManagement.class.getClassLoader().getResource(mappingFileRelativePath); + return Resources.toString(url, Charsets.UTF_8); + } + + protected void choosePrimaryShards(CreateIndexRequest request, boolean hiddenIndex) { + request + .settings( + Settings + .builder() + // put 1 primary shards per hot node if possible + .put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, getNumberOfPrimaryShards()) + // 1 replica for better search performance and fail-over + .put(IndexMetadata.SETTING_NUMBER_OF_REPLICAS, 1) + .put("index.hidden", hiddenIndex) + ); + } + + protected void deleteOldHistoryIndices(String indexPattern, TimeValue 
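choosePrimaryShards above encodes a simple capacity policy: one primary shard per eligible hot data node, capped by the max-primary-shards setting (the Math.min lives in getNumberOfPrimaryShards further down), plus one replica for search performance and fail-over. The arithmetic, with illustrative numbers:

    public class ShardPolicySketch {
        static int numberOfPrimaryShards(int eligibleDataNodes, int maxPrimaryShards) {
            // one primary per hot node if possible, never above the configured cap
            return Math.min(eligibleDataNodes, maxPrimaryShards);
        }

        public static void main(String[] args) {
            System.out.println(numberOfPrimaryShards(3, 10));  // 3: small cluster
            System.out.println(numberOfPrimaryShards(50, 10)); // 10: capped
        }
    }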
historyRetentionPeriod) { + Set candidates = new HashSet(); + + ClusterStateRequest clusterStateRequest = new ClusterStateRequest() + .clear() + .indices(indexPattern) + .metadata(true) + .local(true) + .indicesOptions(IndicesOptions.strictExpand()); + + adminClient.cluster().state(clusterStateRequest, ActionListener.wrap(clusterStateResponse -> { + String latestToDelete = null; + long latest = Long.MIN_VALUE; + for (IndexMetadata indexMetaData : clusterStateResponse.getState().metadata().indices().values()) { + long creationTime = indexMetaData.getCreationDate(); + if ((Instant.now().toEpochMilli() - creationTime) > historyRetentionPeriod.millis()) { + String indexName = indexMetaData.getIndex().getName(); + candidates.add(indexName); + if (latest < creationTime) { + latest = creationTime; + latestToDelete = indexName; + } + } + } + if (candidates.size() > 1) { + // delete all indices except the last one because the last one may contain docs newer than the retention period + candidates.remove(latestToDelete); + String[] toDelete = candidates.toArray(Strings.EMPTY_ARRAY); + DeleteIndexRequest deleteIndexRequest = new DeleteIndexRequest(toDelete); + adminClient.indices().delete(deleteIndexRequest, ActionListener.wrap(deleteIndexResponse -> { + if (!deleteIndexResponse.isAcknowledged()) { + logger.error("Could not delete one or more result indices: {}. Retrying one by one.", Arrays.toString(toDelete)); + deleteIndexIteration(toDelete); + } else { + logger.info("Succeeded in deleting expired result indices: {}.", Arrays.toString(toDelete)); + } + }, exception -> { + logger.error("Failed to delete expired result indices: {}.", Arrays.toString(toDelete)); + deleteIndexIteration(toDelete); + })); + } + }, exception -> { logger.error("Fail to delete result indices", exception); })); + } + + protected void deleteIndexIteration(String[] toDelete) { + for (String index : toDelete) { + DeleteIndexRequest singleDeleteRequest = new DeleteIndexRequest(index); + adminClient.indices().delete(singleDeleteRequest, ActionListener.wrap(singleDeleteResponse -> { + if (!singleDeleteResponse.isAcknowledged()) { + logger.error("Retrying deleting {} does not succeed.", index); + } + }, exception -> { + if (exception instanceof IndexNotFoundException) { + logger.info("{} was already deleted.", index); + } else { + logger.error(new ParameterizedMessage("Retrying deleting {} does not succeed.", index), exception); + } + })); + } + } + + @SuppressWarnings("unchecked") + protected void shouldUpdateConcreteIndex(String concreteIndex, Integer newVersion, ActionListener thenDo) { + IndexMetadata indexMeataData = clusterService.state().getMetadata().indices().get(concreteIndex); + if (indexMeataData == null) { + thenDo.onResponse(Boolean.FALSE); return; } - String resultMapping = getAnomalyResultMappings(); + Integer oldVersion = CommonValue.NO_SCHEMA_VERSION; - Map asMap = XContentHelper.convertToMap(new BytesArray(resultMapping), false, XContentType.JSON).v2(); - Object properties = asMap.get(CommonName.PROPERTIES); - if (properties instanceof Map) { - AD_RESULT_FIELD_CONFIGS = (Map) properties; - } else { - logger.error("Fail to read result mapping file."); + Map indexMapping = indexMeataData.mapping().getSourceAsMap(); + Object meta = indexMapping.get(IndexManagement.META); + if (meta != null && meta instanceof Map) { + Map metaMapping = (Map) meta; + Object schemaVersion = metaMapping.get(org.opensearch.timeseries.constant.CommonName.SCHEMA_VERSION_FIELD); + if (schemaVersion instanceof Integer) { + oldVersion = 
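deleteOldHistoryIndices above deliberately spares the newest index that has aged past the retention period, because an index created before the cutoff can still hold documents newer than it. The selection logic, reduced to plain Java:

    import java.time.Instant;
    import java.util.*;

    public class RetentionSketch {
        static List<String> selectForDeletion(Map<String, Long> creationTimeByIndex, long retentionMillis, long nowMillis) {
            Set<String> candidates = new HashSet<>();
            String latestToDelete = null;
            long latest = Long.MIN_VALUE;
            for (Map.Entry<String, Long> e : creationTimeByIndex.entrySet()) {
                long creationTime = e.getValue();
                if (nowMillis - creationTime > retentionMillis) {
                    candidates.add(e.getKey());
                    if (creationTime > latest) {
                        latest = creationTime;
                        latestToDelete = e.getKey();
                    }
                }
            }
            if (candidates.size() > 1) {
                // keep the newest expired index; it may hold docs younger than the cutoff
                candidates.remove(latestToDelete);
            } else {
                candidates.clear(); // a single expired index is kept for the same reason
            }
            return new ArrayList<>(candidates);
        }

        public static void main(String[] args) {
            long now = Instant.now().toEpochMilli();
            long day = 24L * 60 * 60 * 1000;
            Map<String, Long> indices = Map.of(
                "results-1", now - 40 * day,
                "results-2", now - 35 * day,
                "results-3", now - 2 * day
            );
            System.out.println(selectForDeletion(indices, 30 * day, now)); // [results-1]
        }
    }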
(Integer) schemaVersion; + } } + thenDo.onResponse(newVersion > oldVersion); } - /** - * Get anomaly detector index mapping json content. - * - * @return anomaly detector index mapping - * @throws IOException IOException if mapping file can't be read correctly - */ - public static String getAnomalyDetectorMappings() throws IOException { - URL url = AnomalyDetectionIndices.class.getClassLoader().getResource(ANOMALY_DETECTORS_INDEX_MAPPING_FILE); - return Resources.toString(url, Charsets.UTF_8); + protected void updateJobIndexSettingIfNecessary(String indexName, IndexState jobIndexState, ActionListener listener) { + GetSettingsRequest getSettingsRequest = new GetSettingsRequest() + .indices(indexName) + .names( + new String[] { + IndexMetadata.SETTING_NUMBER_OF_SHARDS, + IndexMetadata.SETTING_NUMBER_OF_REPLICAS, + IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS } + ); + client.execute(GetSettingsAction.INSTANCE, getSettingsRequest, ActionListener.wrap(settingResponse -> { + // auto expand setting is a range string like "1-all" + String autoExpandReplica = getStringSetting(settingResponse, IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS); + // if the auto expand setting is already there, return immediately + if (autoExpandReplica != null) { + jobIndexState.settingUpToDate = true; + logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", indexName)); + listener.onResponse(null); + return; + } + Integer primaryShardsNumber = getIntegerSetting(settingResponse, IndexMetadata.SETTING_NUMBER_OF_SHARDS); + Integer replicaNumber = getIntegerSetting(settingResponse, IndexMetadata.SETTING_NUMBER_OF_REPLICAS); + if (primaryShardsNumber == null || replicaNumber == null) { + logger + .error( + new ParameterizedMessage( + "Fail to find job index's primary or replica shard number: primary [{}], replica [{}]", + primaryShardsNumber, + replicaNumber + ) + ); + // don't throw exception as we don't know how to handle it and retry next time + listener.onResponse(null); + return; + } + // at least minJobIndexReplicas + // at most maxJobIndexReplicas / primaryShardsNumber replicas. + // For example, if we have 2 primary shards, since the max number of shards are maxJobIndexReplicas (20), + // we will use 20 / 2 = 10 replicas as the upper bound of replica. + int maxExpectedReplicas = Math + .max(IndexManagement.maxJobIndexReplicas / primaryShardsNumber, IndexManagement.minJobIndexReplicas); + Settings updatedSettings = Settings + .builder() + .put(IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS, IndexManagement.minJobIndexReplicas + "-" + maxExpectedReplicas) + .build(); + final UpdateSettingsRequest updateSettingsRequest = new UpdateSettingsRequest(indexName).settings(updatedSettings); + client.admin().indices().updateSettings(updateSettingsRequest, ActionListener.wrap(response -> { + jobIndexState.settingUpToDate = true; + logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", indexName)); + listener.onResponse(null); + }, listener::onFailure)); + }, e -> { + if (e instanceof IndexNotFoundException) { + // new index will be created with auto expand replica setting + jobIndexState.settingUpToDate = true; + logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", indexName)); + listener.onResponse(null); + } else { + listener.onFailure(e); + } + })); } /** - * Get anomaly result index mapping json content. + * Create config index if not exist. 
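The replica bound computed in updateJobIndexSettingIfNecessary above targets a roughly constant total copy count for the job index: with minJobIndexReplicas = 1 and maxJobIndexReplicas = 20, two primaries yield the auto-expand range "1-10". A small sketch of the arithmetic:

    public class AutoExpandReplicaSketch {
        static String autoExpandRange(int primaryShards, int minReplicas, int maxReplicas) {
            // at least minReplicas, at most maxReplicas / primaryShards replicas per primary
            int maxExpectedReplicas = Math.max(maxReplicas / primaryShards, minReplicas);
            return minReplicas + "-" + maxExpectedReplicas;
        }

        public static void main(String[] args) {
            System.out.println(autoExpandRange(2, 1, 20));  // 1-10
            System.out.println(autoExpandRange(1, 1, 20));  // 1-20
            System.out.println(autoExpandRange(40, 1, 20)); // 1-1: floor at the minimum
        }
    }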
* - * @return anomaly result index mapping - * @throws IOException IOException if mapping file can't be read correctly + * @param actionListener action called after create index + * @throws IOException IOException from {@link IndexManagement#getConfigMappings} */ - public static String getAnomalyResultMappings() throws IOException { - URL url = AnomalyDetectionIndices.class.getClassLoader().getResource(ANOMALY_RESULTS_INDEX_MAPPING_FILE); - return Resources.toString(url, Charsets.UTF_8); + public void initConfigIndexIfAbsent(ActionListener actionListener) throws IOException { + if (!doesConfigIndexExist()) { + initConfigIndex(actionListener); + } } /** - * Get anomaly detector job index mapping json content. + * Create config index directly. * - * @return anomaly detector job index mapping - * @throws IOException IOException if mapping file can't be read correctly + * @param actionListener action called after create index + * @throws IOException IOException from {@link IndexManagement#getConfigMappings} */ - public static String getAnomalyDetectorJobMappings() throws IOException { - URL url = AnomalyDetectionIndices.class.getClassLoader().getResource(ANOMALY_DETECTOR_JOBS_INDEX_MAPPING_FILE); - return Resources.toString(url, Charsets.UTF_8); + public void initConfigIndex(ActionListener actionListener) throws IOException { + CreateIndexRequest request = new CreateIndexRequest(CommonName.CONFIG_INDEX) + .mapping(getConfigMappings(), XContentType.JSON) + .settings(settings); + adminClient.indices().create(request, actionListener); } /** - * Get anomaly detector state index mapping json content. + * Config index exist or not. * - * @return anomaly detector state index mapping - * @throws IOException IOException if mapping file can't be read correctly + * @return true if config index exists */ - public static String getDetectionStateMappings() throws IOException { - URL url = AnomalyDetectionIndices.class.getClassLoader().getResource(ANOMALY_DETECTION_STATE_INDEX_MAPPING_FILE); - String detectionStateMappings = Resources.toString(url, Charsets.UTF_8); - String detectorIndexMappings = AnomalyDetectionIndices.getAnomalyDetectorMappings(); - detectorIndexMappings = detectorIndexMappings - .substring(detectorIndexMappings.indexOf("\"properties\""), detectorIndexMappings.lastIndexOf("}")); - return detectionStateMappings.replace("DETECTOR_INDEX_MAPPING_PLACE_HOLDER", detectorIndexMappings); + public boolean doesConfigIndexExist() { + return doesIndexExist(CommonName.CONFIG_INDEX); } /** - * Get checkpoint index mapping json content. + * Job index exist or not. * - * @return checkpoint index mapping - * @throws IOException IOException if mapping file can't be read correctly + * @return true if anomaly detector job index exists */ - public static String getCheckpointMappings() throws IOException { - URL url = AnomalyDetectionIndices.class.getClassLoader().getResource(CHECKPOINT_INDEX_MAPPING_FILE); - return Resources.toString(url, Charsets.UTF_8); + public boolean doesJobIndexExist() { + return doesIndexExist(CommonName.JOB_INDEX); } /** - * Anomaly detector index exist or not. + * Get config index mapping in json format. 
 * - * @return true if anomaly detector index exists + * @return config index mapping + * @throws IOException IOException if mapping file can't be read correctly */ - public boolean doesAnomalyDetectorIndexExist() { - return clusterService.state().getRoutingTable().hasIndex(AnomalyDetector.ANOMALY_DETECTORS_INDEX); + public static String getConfigMappings() throws IOException { + return getMappings(TimeSeriesSettings.CONFIG_INDEX_MAPPING_FILE); } /** - * Anomaly detector job index exist or not. + * Get job index mapping in json format. * - * @return true if anomaly detector job index exists + * @return job index mapping + * @throws IOException IOException if mapping file can't be read correctly */ - public boolean doesAnomalyDetectorJobIndexExist() { - return clusterService.state().getRoutingTable().hasIndex(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX); + public static String getJobMappings() throws IOException { + return getMappings(TimeSeriesSettings.JOBS_INDEX_MAPPING_FILE); } /** - * anomaly result index exist or not. + * Create job index. * - * @return true if anomaly result index exists + * @param actionListener action called after create index */ - public boolean doesDefaultAnomalyResultIndexExist() { - return clusterService.state().metadata().hasAlias(CommonName.ANOMALY_RESULT_INDEX_ALIAS); - } - - public boolean doesIndexExist(String indexName) { - return clusterService.state().metadata().hasIndex(indexName); - } - - public void initCustomResultIndexAndExecute(String resultIndex, AnomalyDetectorFunction function, ActionListener listener) { + public void initJobIndex(ActionListener actionListener) { try { - if (!doesIndexExist(resultIndex)) { - initCustomAnomalyResultIndexDirectly(resultIndex, ActionListener.wrap(response -> { - if (response.isAcknowledged()) { - logger.info("Successfully created anomaly detector result index {}", resultIndex); - validateCustomResultIndexAndExecute(resultIndex, function, listener); - } else { - String error = "Creating anomaly detector result index with mappings call not acknowledged: " + resultIndex; - logger.error(error); - listener.onFailure(new EndRunException(error, true)); - } - }, exception -> { - if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) { - // It is possible the index has been created while we sending the create request - validateCustomResultIndexAndExecute(resultIndex, function, listener); - } else { - logger.error("Failed to create anomaly detector result index " + resultIndex, exception); - listener.onFailure(exception); - } - })); - } else { - validateCustomResultIndexAndExecute(resultIndex, function, listener); - } - } catch (Exception e) { - logger.error("Failed to create custom result index " + resultIndex, e); - listener.onFailure(e); + CreateIndexRequest request = new CreateIndexRequest(CommonName.JOB_INDEX).mapping(getJobMappings(), XContentType.JSON); + request + .settings( + Settings + .builder() + // AD job index is small. 1 primary shard is enough + .put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, 1) + // Job scheduler puts both primary and replica shards in the + // hash ring. Auto-expand the number of replicas based on the + // number of data nodes (up to 20) in the cluster so that each node can + // become a coordinating node. This is useful when customers + // scale out their cluster so that we can do adaptive scaling + // accordingly. + // At least 1 replica for fail-over.
+ .put(IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS, minJobIndexReplicas + "-" + maxJobIndexReplicas) + .put("index.hidden", true) + ); + adminClient.indices().create(request, actionListener); + } catch (IOException e) { + logger.error("Fail to init AD job index", e); + actionListener.onFailure(e); } } - public void validateCustomResultIndexAndExecute(String resultIndex, AnomalyDetectorFunction function, ActionListener listener) { + public void validateCustomResultIndexAndExecute(String resultIndex, ExecutorFunction function, ActionListener listener) { try { if (!isValidResultIndexMapping(resultIndex)) { logger.warn("Can't create detector with custom result index {} as its mapping is invalid", resultIndex); - listener.onFailure(new IllegalArgumentException(CommonErrorMessages.INVALID_RESULT_INDEX_MAPPING + resultIndex)); + listener.onFailure(new IllegalArgumentException(CommonMessages.INVALID_RESULT_INDEX_MAPPING + resultIndex)); return; } - AnomalyResult dummyResult = AnomalyResult.getDummyResult(); - IndexRequest indexRequest = new IndexRequest(resultIndex) - .id(DUMMY_AD_RESULT_ID) - .source(dummyResult.toXContent(XContentBuilder.builder(XContentType.JSON.xContent()), ToXContent.EMPTY_PARAMS)); + IndexRequest indexRequest = createDummyIndexRequest(resultIndex); + // User may have no write permission on custom result index. Talked with security plugin team, seems no easy way to verify - // if user has write permission. So just tried to write and delete a dummy anomaly result to verify. + // if user has write permission. So just tried to write and delete a dummy forecast result to verify. client.index(indexRequest, ActionListener.wrap(response -> { - logger.debug("Successfully wrote dummy AD result to result index {}", resultIndex); - client.delete(new DeleteRequest(resultIndex).id(DUMMY_AD_RESULT_ID), ActionListener.wrap(deleteResponse -> { - logger.debug("Successfully deleted dummy AD result from result index {}", resultIndex); + logger.debug("Successfully wrote dummy result to result index {}", resultIndex); + client.delete(createDummyDeleteRequest(resultIndex), ActionListener.wrap(deleteResponse -> { + logger.debug("Successfully deleted dummy result from result index {}", resultIndex); function.execute(); }, ex -> { - logger.error("Failed to delete dummy AD result from result index " + resultIndex, ex); + logger.error("Failed to delete dummy result from result index " + resultIndex, ex); listener.onFailure(ex); })); }, exception -> { - logger.error("Failed to write dummy AD result to result index " + resultIndex, exception); + logger.error("Failed to write dummy result to result index " + resultIndex, exception); listener.onFailure(exception); })); } catch (Exception e) { - logger.error("Failed to create detector with custom result index " + resultIndex, e); + logger.error("Failed to validate custom result index " + resultIndex, e); listener.onFailure(e); } } + public void update() { + if ((allMappingUpdated && allSettingUpdated) || updateRunningTimes >= maxUpdateRunningTimes || updateRunning.get()) { + return; + } + updateRunning.set(true); + updateRunningTimes++; + + // set updateRunning to false when both updateMappingIfNecessary and updateSettingIfNecessary + // stop running + final GroupedActionListener groupListeneer = new GroupedActionListener<>( + ActionListener.wrap(r -> updateRunning.set(false), exception -> { + updateRunning.set(false); + logger.error("Fail to update time series indices", exception); + }), + // 2 since we need both updateMappingIfNecessary and 
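As the comments in validateCustomResultIndexAndExecute above note, there is no cheap way to ask the security plugin whether the current user may write to a custom result index, so validation is a probe: index a dummy document, delete it, and only then run the real work. The callback shape, with in-memory stand-ins for the OpenSearch client and listeners:

    import java.util.function.Consumer;

    public class WriteProbeSketch {
        interface Store {
            void index(String docId, Runnable onSuccess, Consumer<Exception> onFailure);
            void delete(String docId, Runnable onSuccess, Consumer<Exception> onFailure);
        }

        static void probeThenExecute(Store store, Runnable work, Consumer<Exception> onFailure) {
            // write a dummy doc; on success delete it; only then run the real work
            store.index("dummy", () -> store.delete("dummy", work, onFailure), onFailure);
        }

        public static void main(String[] args) {
            Store inMemory = new Store() {
                final java.util.Set<String> docs = new java.util.HashSet<>();
                public void index(String id, Runnable ok, Consumer<Exception> fail) { docs.add(id); ok.run(); }
                public void delete(String id, Runnable ok, Consumer<Exception> fail) { docs.remove(id); ok.run(); }
            };
            probeThenExecute(inMemory, () -> System.out.println("validated; proceeding"), e -> System.err.println(e));
        }
    }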
updateSettingIfNecessary to return + // before setting updateRunning to false + 2 + ); + + updateMappingIfNecessary(groupListeneer); + updateSettingIfNecessary(groupListeneer); + } + + private void updateSettingIfNecessary(GroupedActionListener delegateListeneer) { + if (allSettingUpdated) { + delegateListeneer.onResponse(null); + return; + } + + List updates = new ArrayList<>(); + for (IndexType index : indexType.getEnumConstants()) { + Boolean updated = indexStates.computeIfAbsent(index, k -> new IndexState(k.getMapping())).settingUpToDate; + if (Boolean.FALSE.equals(updated)) { + updates.add(index); + } + } + if (updates.size() == 0) { + allSettingUpdated = true; + delegateListeneer.onResponse(null); + return; + } + + final GroupedActionListener conglomerateListeneer = new GroupedActionListener<>( + ActionListener.wrap(r -> delegateListeneer.onResponse(null), exception -> { + delegateListeneer.onResponse(null); + logger.error("Fail to update time series indices' mappings", exception); + }), + updates.size() + ); + for (IndexType timeseriesIndex : updates) { + logger.info(new ParameterizedMessage("Check [{}]'s setting", timeseriesIndex.getIndexName())); + if (timeseriesIndex.isJobIndex()) { + updateJobIndexSettingIfNecessary( + ADIndex.JOB.getIndexName(), + indexStates.computeIfAbsent(timeseriesIndex, k -> new IndexState(k.getMapping())), + conglomerateListeneer + ); + } else { + // we don't have settings to update for other indices + IndexState indexState = indexStates.computeIfAbsent(timeseriesIndex, k -> new IndexState(k.getMapping())); + indexState.settingUpToDate = true; + logger.info(new ParameterizedMessage("Mark [{}]'s setting up-to-date", timeseriesIndex.getIndexName())); + conglomerateListeneer.onResponse(null); + } + } + } + + /** + * Update mapping if schema version changes. 
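Both update passes hang off GroupedActionListener: update() groups the mapping and setting passes (size 2) before clearing updateRunning, and updateSettingIfNecessary above groups one response per index that needs work. The pattern, reduced to a counter (a sketch, not the OpenSearch class):

    import java.util.concurrent.atomic.AtomicBoolean;
    import java.util.concurrent.atomic.AtomicInteger;

    public class GroupedListenerSketch {
        // returns a callback that fires onAllDone after n invocations
        static Runnable grouped(int n, Runnable onAllDone) {
            AtomicInteger remaining = new AtomicInteger(n);
            return () -> {
                if (remaining.decrementAndGet() == 0) {
                    onAllDone.run();
                }
            };
        }

        public static void main(String[] args) {
            AtomicBoolean updateRunning = new AtomicBoolean(true);
            Runnable branchDone = grouped(2, () -> updateRunning.set(false));
            branchDone.run(); // mapping pass finished
            System.out.println(updateRunning.get()); // true: still waiting on settings
            branchDone.run(); // setting pass finished
            System.out.println(updateRunning.get()); // false
        }
    }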
+ */ + private void updateMappingIfNecessary(GroupedActionListener delegateListeneer) { + if (allMappingUpdated) { + delegateListeneer.onResponse(null); + return; + } + + List updates = new ArrayList<>(); + for (IndexType index : indexType.getEnumConstants()) { + Boolean updated = indexStates.computeIfAbsent(index, k -> new IndexState(k.getMapping())).mappingUpToDate; + if (Boolean.FALSE.equals(updated)) { + updates.add(index); + } + } + if (updates.size() == 0) { + allMappingUpdated = true; + delegateListeneer.onResponse(null); + return; + } + + final GroupedActionListener conglomerateListeneer = new GroupedActionListener<>( + ActionListener.wrap(r -> delegateListeneer.onResponse(null), exception -> { + delegateListeneer.onResponse(null); + logger.error("Fail to update time series indices' mappings", exception); + }), + updates.size() + ); + + for (IndexType adIndex : updates) { + logger.info(new ParameterizedMessage("Check [{}]'s mapping", adIndex.getIndexName())); + shouldUpdateIndex(adIndex, ActionListener.wrap(shouldUpdate -> { + if (shouldUpdate) { + adminClient + .indices() + .putMapping( + new PutMappingRequest().indices(adIndex.getIndexName()).source(adIndex.getMapping(), XContentType.JSON), + ActionListener.wrap(putMappingResponse -> { + if (putMappingResponse.isAcknowledged()) { + logger.info(new ParameterizedMessage("Succeeded in updating [{}]'s mapping", adIndex.getIndexName())); + markMappingUpdated(adIndex); + } else { + logger.error(new ParameterizedMessage("Fail to update [{}]'s mapping", adIndex.getIndexName())); + } + conglomerateListeneer.onResponse(null); + }, exception -> { + logger + .error( + new ParameterizedMessage( + "Fail to update [{}]'s mapping due to [{}]", + adIndex.getIndexName(), + exception.getMessage() + ) + ); + conglomerateListeneer.onFailure(exception); + }) + ); + } else { + // index does not exist or the version is already up-to-date. + // When creating index, new mappings will be used. + // We don't need to update it. 
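The mapping pass above only issues a putMapping when the schema version bundled in the mapping file is newer than the _meta.schema_version stored on the live index (see shouldUpdateConcreteIndex earlier). The comparison, with NO_SCHEMA_VERSION assumed to be 0 here:

    public class SchemaUpgradeCheckSketch {
        static final int NO_SCHEMA_VERSION = 0; // assumption; see CommonValue in the plugin

        static boolean shouldUpdate(Integer liveVersion, int fileVersion) {
            int oldVersion = liveVersion == null ? NO_SCHEMA_VERSION : liveVersion;
            return fileVersion > oldVersion;
        }

        public static void main(String[] args) {
            System.out.println(shouldUpdate(3, 5));    // true: file is newer
            System.out.println(shouldUpdate(5, 5));    // false: already up-to-date
            System.out.println(shouldUpdate(null, 1)); // true: no recorded version
        }
    }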
+ logger.info(new ParameterizedMessage("We don't need to update [{}]'s mapping", adIndex.getIndexName())); + markMappingUpdated(adIndex); + conglomerateListeneer.onResponse(null); + } + }, exception -> { + logger + .error( + new ParameterizedMessage("Fail to check whether we should update [{}]'s mapping", adIndex.getIndexName()), + exception + ); + conglomerateListeneer.onFailure(exception); + })); + + } + } + + private void markMappingUpdated(IndexType adIndex) { + IndexState indexState = indexStates.computeIfAbsent(adIndex, k -> new IndexState(k.getMapping())); + if (Boolean.FALSE.equals(indexState.mappingUpToDate)) { + indexState.mappingUpToDate = Boolean.TRUE; + logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", adIndex.getIndexName())); + } + } + + private void shouldUpdateIndex(IndexType index, ActionListener thenDo) { + boolean exists = false; + if (index.isAlias()) { + exists = doesAliasExist(index.getIndexName()); + } else { + exists = doesIndexExist(index.getIndexName()); + } + if (false == exists) { + thenDo.onResponse(Boolean.FALSE); + return; + } + + Integer newVersion = indexStates.computeIfAbsent(index, k -> new IndexState(k.getMapping())).schemaVersion; + if (index.isAlias()) { + GetAliasesRequest getAliasRequest = new GetAliasesRequest() + .aliases(index.getIndexName()) + .indicesOptions(IndicesOptions.lenientExpandOpenHidden()); + adminClient.indices().getAliases(getAliasRequest, ActionListener.wrap(getAliasResponse -> { + String concreteIndex = null; + for (Map.Entry> entry : getAliasResponse.getAliases().entrySet()) { + if (false == entry.getValue().isEmpty()) { + // we assume the alias map to one concrete index, thus we can return after finding one + concreteIndex = entry.getKey(); + break; + } + } + if (concreteIndex == null) { + thenDo.onResponse(Boolean.FALSE); + return; + } + shouldUpdateConcreteIndex(concreteIndex, newVersion, thenDo); + }, exception -> logger.error(new ParameterizedMessage("Fail to get [{}]'s alias", index.getIndexName()), exception))); + } else { + shouldUpdateConcreteIndex(index.getIndexName(), newVersion, thenDo); + } + } + + /** + * + * @param index Index metadata + * @return The schema version of the given Index + */ + public int getSchemaVersion(IndexType index) { + IndexState indexState = this.indexStates.computeIfAbsent(index, k -> new IndexState(k.getMapping())); + return indexState.schemaVersion; + } + + public void initCustomResultIndexAndExecute(String resultIndex, ExecutorFunction function, ActionListener listener) { + if (!doesIndexExist(resultIndex)) { + initCustomResultIndexDirectly(resultIndex, ActionListener.wrap(response -> { + if (response.isAcknowledged()) { + logger.info("Successfully created result index {}", resultIndex); + validateCustomResultIndexAndExecute(resultIndex, function, listener); + } else { + String error = "Creating result index with mappings call not acknowledged: " + resultIndex; + logger.error(error); + listener.onFailure(new EndRunException(error, false)); + } + }, exception -> { + if (ExceptionsHelper.unwrapCause(exception) instanceof ResourceAlreadyExistsException) { + // It is possible the index has been created while we sending the create request + validateCustomResultIndexAndExecute(resultIndex, function, listener); + } else { + logger.error("Failed to create result index " + resultIndex, exception); + listener.onFailure(exception); + } + })); + } else { + validateCustomResultIndexAndExecute(resultIndex, function, listener); + } + } + public void validateCustomIndexForBackendJob( 
String resultIndex, String securityLogId, String user, List roles, - AnomalyDetectorFunction function, + ExecutorFunction function, ActionListener listener ) { if (!doesIndexExist(resultIndex)) { @@ -417,739 +789,199 @@ public void validateCustomIndexForBackendJob( } } - /** - * Check if custom result index has correct index mapping. - * @param resultIndex result index - * @return true if result index mapping is valid - */ - public boolean isValidResultIndexMapping(String resultIndex) { - try { - initResultMapping(); - if (AD_RESULT_FIELD_CONFIGS == null) { - // failed to populate the field - return false; - } - IndexMetadata indexMetadata = clusterService.state().metadata().index(resultIndex); - Map indexMapping = indexMetadata.mapping().sourceAsMap(); - String propertyName = CommonName.PROPERTIES; - if (!indexMapping.containsKey(propertyName) || !(indexMapping.get(propertyName) instanceof LinkedHashMap)) { - return false; - } - LinkedHashMap mapping = (LinkedHashMap) indexMapping.get(propertyName); + protected int getNumberOfPrimaryShards() { + return Math.min(nodeFilter.getNumberOfEligibleDataNodes(), maxPrimaryShards); + } - boolean correctResultIndexMapping = true; + @Override + public void onClusterManager() { + try { + // try to rollover immediately as we might be restarting the cluster + rolloverAndDeleteHistoryIndex(); - for (String fieldName : AD_RESULT_FIELD_CONFIGS.keySet()) { - Object defaultSchema = AD_RESULT_FIELD_CONFIGS.get(fieldName); - // the field might be a map or map of map - // example: map: {type=date, format=strict_date_time||epoch_millis} - // map of map: {type=nested, properties={likelihood={type=double}, value_list={type=nested, properties={data={type=double}, - // feature_id={type=keyword}}}}} - // if it is a map of map, Object.equals can compare them regardless of order - if (!mapping.containsKey(fieldName) || !defaultSchema.equals(mapping.get(fieldName))) { - correctResultIndexMapping = false; - break; - } - } - return correctResultIndexMapping; + // schedule the next rollover for approx MAX_AGE later + scheduledRollover = threadPool + .scheduleWithFixedDelay(() -> rolloverAndDeleteHistoryIndex(), historyRolloverPeriod, executorName()); } catch (Exception e) { - logger.error("Failed to validate result index mapping for index " + resultIndex, e); - return false; + // This should be run on cluster startup + logger.error("Error rollover result indices. " + "Can't rollover result until clusterManager node is restarted.", e); } - } - /** - * Anomaly state index exist or not. - * - * @return true if anomaly state index exists - */ - public boolean doesDetectorStateIndexExist() { - return clusterService.state().getRoutingTable().hasIndex(CommonName.DETECTION_STATE_INDEX); - } - - /** - * Checkpoint index exist or not. 
- * - * @return true if checkpoint index exists - */ - public boolean doesCheckpointIndexExist() { - return clusterService.state().getRoutingTable().hasIndex(CommonName.CHECKPOINT_INDEX_NAME); - } - - /** - * Index exists or not - * @param clusterServiceAccessor Cluster service - * @param name Index name - * @return true if the index exists - */ - public static boolean doesIndexExists(ClusterService clusterServiceAccessor, String name) { - return clusterServiceAccessor.state().getRoutingTable().hasIndex(name); + @Override + public void offClusterManager() { + if (scheduledRollover != null) { + scheduledRollover.cancel(); + } } - /** - * Alias exists or not - * @param clusterServiceAccessor Cluster service - * @param alias Alias name - * @return true if the alias exists - */ - public static boolean doesAliasExists(ClusterService clusterServiceAccessor, String alias) { - return clusterServiceAccessor.state().metadata().hasAlias(alias); + private String executorName() { + return ThreadPool.Names.MANAGEMENT; } - private ActionListener markMappingUpToDate(ADIndex index, ActionListener followingListener) { - return ActionListener.wrap(createdResponse -> { - if (createdResponse.isAcknowledged()) { - IndexState indexStatetate = indexStates.computeIfAbsent(index, IndexState::new); - if (Boolean.FALSE.equals(indexStatetate.mappingUpToDate)) { - indexStatetate.mappingUpToDate = Boolean.TRUE; - logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", index.getIndexName())); - } + protected void rescheduleRollover() { + if (clusterService.state().getNodes().isLocalNodeElectedClusterManager()) { + if (scheduledRollover != null) { + scheduledRollover.cancel(); } - followingListener.onResponse(createdResponse); - }, exception -> followingListener.onFailure(exception)); - } - - /** - * Create anomaly detector index if not exist. - * - * @param actionListener action called after create index - * @throws IOException IOException from {@link AnomalyDetectionIndices#getAnomalyDetectorMappings} - */ - public void initAnomalyDetectorIndexIfAbsent(ActionListener actionListener) throws IOException { - if (!doesAnomalyDetectorIndexExist()) { - initAnomalyDetectorIndex(actionListener); + scheduledRollover = threadPool + .scheduleWithFixedDelay(() -> rolloverAndDeleteHistoryIndex(), historyRolloverPeriod, executorName()); } } - /** - * Create anomaly detector index directly. - * - * @param actionListener action called after create index - * @throws IOException IOException from {@link AnomalyDetectionIndices#getAnomalyDetectorMappings} - */ - public void initAnomalyDetectorIndex(ActionListener actionListener) throws IOException { - CreateIndexRequest request = new CreateIndexRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX) - .mapping(getAnomalyDetectorMappings(), XContentType.JSON) - .settings(settings); - adminClient.indices().create(request, markMappingUpToDate(ADIndex.CONFIG, actionListener)); - } - - /** - * Create anomaly result index if not exist. 
- * - * @param actionListener action called after create index - * @throws IOException IOException from {@link AnomalyDetectionIndices#getAnomalyResultMappings} - */ - public void initDefaultAnomalyResultIndexIfAbsent(ActionListener actionListener) throws IOException { - if (!doesDefaultAnomalyResultIndexExist()) { - initDefaultAnomalyResultIndexDirectly(actionListener); + private void initResultMapping() throws IOException { + if (RESULT_FIELD_CONFIGS != null) { + // we have already initiated the field + return; } - } - - /** - * choose the number of primary shards for checkpoint, multientity result, and job scheduler based on the number of hot nodes. Max 10. - * @param request The request to add the setting - */ - private void choosePrimaryShards(CreateIndexRequest request) { - choosePrimaryShards(request, true); - } - - private void choosePrimaryShards(CreateIndexRequest request, boolean hiddenIndex) { - request - .settings( - Settings - .builder() - // put 1 primary shards per hot node if possible - .put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, getNumberOfPrimaryShards()) - // 1 replica for better search performance and fail-over - .put(IndexMetadata.SETTING_NUMBER_OF_REPLICAS, 1) - .put("index.hidden", hiddenIndex) - ); - } - - private int getNumberOfPrimaryShards() { - return Math.min(nodeFilter.getNumberOfEligibleDataNodes(), maxPrimaryShards); - } - - /** - * Create anomaly result index without checking exist or not. - * - * @param actionListener action called after create index - * @throws IOException IOException from {@link AnomalyDetectionIndices#getAnomalyResultMappings} - */ - public void initDefaultAnomalyResultIndexDirectly(ActionListener actionListener) throws IOException { - initAnomalyResultIndexDirectly(AD_RESULT_HISTORY_INDEX_PATTERN, CommonName.ANOMALY_RESULT_INDEX_ALIAS, true, actionListener); - } - public void initCustomAnomalyResultIndexDirectly(String resultIndex, ActionListener actionListener) - throws IOException { - initAnomalyResultIndexDirectly(resultIndex, null, false, actionListener); - } - - public void initAnomalyResultIndexDirectly( - String resultIndex, - String alias, - boolean hiddenIndex, - ActionListener actionListener - ) throws IOException { - String mapping = getAnomalyResultMappings(); - CreateIndexRequest request = new CreateIndexRequest(resultIndex).mapping(mapping, XContentType.JSON); - if (alias != null) { - request.alias(new Alias(CommonName.ANOMALY_RESULT_INDEX_ALIAS)); - } - choosePrimaryShards(request, hiddenIndex); - if (AD_RESULT_HISTORY_INDEX_PATTERN.equals(resultIndex)) { - adminClient.indices().create(request, markMappingUpToDate(ADIndex.RESULT, actionListener)); + Map asMap = XContentHelper.convertToMap(new BytesArray(resultMapping), false, XContentType.JSON).v2(); + Object properties = asMap.get(CommonName.PROPERTIES); + if (properties instanceof Map) { + RESULT_FIELD_CONFIGS = (Map) properties; } else { - adminClient.indices().create(request, actionListener); + logger.error("Fail to read result mapping file."); } } /** - * Create anomaly detector job index. - * - * @param actionListener action called after create index + * Check if custom result index has correct index mapping. 
+ * @param resultIndex result index + * @return true if result index mapping is valid */ - public void initAnomalyDetectorJobIndex(ActionListener actionListener) { + public boolean isValidResultIndexMapping(String resultIndex) { try { - CreateIndexRequest request = new CreateIndexRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - .mapping(getAnomalyDetectorJobMappings(), XContentType.JSON); - request - .settings( - Settings - .builder() - // AD job index is small. 1 primary shard is enough - .put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, 1) - // Job scheduler puts both primary and replica shards in the - // hash ring. Auto-expand the number of replicas based on the - // number of data nodes (up to 20) in the cluster so that each node can - // become a coordinating node. This is useful when customers - // scale out their cluster so that we can do adaptive scaling - // accordingly. - // At least 1 replica for fail-over. - .put(IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS, minJobIndexReplicas + "-" + maxJobIndexReplicas) - .put("index.hidden", true) - ); - adminClient.indices().create(request, markMappingUpToDate(ADIndex.JOB, actionListener)); - } catch (IOException e) { - logger.error("Fail to init AD job index", e); - actionListener.onFailure(e); - } - } + initResultMapping(); + if (RESULT_FIELD_CONFIGS == null) { + // failed to populate the field + return false; + } + IndexMetadata indexMetadata = clusterService.state().metadata().index(resultIndex); + Map indexMapping = indexMetadata.mapping().sourceAsMap(); + String propertyName = CommonName.PROPERTIES; + if (!indexMapping.containsKey(propertyName) || !(indexMapping.get(propertyName) instanceof LinkedHashMap)) { + return false; + } + LinkedHashMap mapping = (LinkedHashMap) indexMapping.get(propertyName); - /** - * Create the state index. - * - * @param actionListener action called after create index - */ - public void initDetectionStateIndex(ActionListener actionListener) { - try { - CreateIndexRequest request = new CreateIndexRequest(CommonName.DETECTION_STATE_INDEX) - .mapping(getDetectionStateMappings(), XContentType.JSON) - .settings(settings); - adminClient.indices().create(request, markMappingUpToDate(ADIndex.STATE, actionListener)); - } catch (IOException e) { - logger.error("Fail to init AD detection state index", e); - actionListener.onFailure(e); + boolean correctResultIndexMapping = true; + + for (String fieldName : RESULT_FIELD_CONFIGS.keySet()) { + Object defaultSchema = RESULT_FIELD_CONFIGS.get(fieldName); + // the field might be a map or map of map + // example: map: {type=date, format=strict_date_time||epoch_millis} + // map of map: {type=nested, properties={likelihood={type=double}, value_list={type=nested, properties={data={type=double}, + // feature_id={type=keyword}}}}} + // if it is a map of map, Object.equals can compare them regardless of order + if (!mapping.containsKey(fieldName) || !defaultSchema.equals(mapping.get(fieldName))) { + correctResultIndexMapping = false; + break; + } + } + return correctResultIndexMapping; + } catch (Exception e) { + logger.error("Failed to validate result index mapping for index " + resultIndex, e); + return false; } + } /** - * Create the checkpoint index. + * Create forecast result index if not exist. 
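isValidResultIndexMapping above accepts a custom result index as long as every field defined by the default result mapping is present with an equal configuration; extra user fields are allowed, and nested configs compare via Map.equals regardless of key order. A reduced version of the check (field names are illustrative):

    import java.util.Map;

    public class ResultMappingCheckSketch {
        static boolean isValid(Map<String, Object> required, Map<String, Object> actual) {
            for (Map.Entry<String, Object> field : required.entrySet()) {
                // missing field or differing config invalidates the index
                if (!field.getValue().equals(actual.get(field.getKey()))) {
                    return false;
                }
            }
            return true;
        }

        public static void main(String[] args) {
            Map<String, Object> required = Map.of(
                "data_start_time", Map.of("type", "date", "format", "strict_date_time||epoch_millis")
            );
            Map<String, Object> good = Map.of(
                "data_start_time", Map.of("format", "strict_date_time||epoch_millis", "type", "date"),
                "extra_field", Map.of("type", "keyword") // extra fields are fine
            );
            System.out.println(isValid(required, good));     // true: order-insensitive match
            System.out.println(isValid(required, Map.of())); // false: required field missing
        }
    }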
* * @param actionListener action called after create index - * @throws EndRunException EndRunException due to failure to get mapping */ - public void initCheckpointIndex(ActionListener actionListener) { - String mapping; - try { - mapping = getCheckpointMappings(); - } catch (IOException e) { - throw new EndRunException("", "Cannot find checkpoint mapping file", true); - } - CreateIndexRequest request = new CreateIndexRequest(CommonName.CHECKPOINT_INDEX_NAME).mapping(mapping, XContentType.JSON); - choosePrimaryShards(request); - adminClient.indices().create(request, markMappingUpToDate(ADIndex.CHECKPOINT, actionListener)); - } - - @Override - public void onClusterManager() { - try { - // try to rollover immediately as we might be restarting the cluster - rolloverAndDeleteHistoryIndex(); - - // schedule the next rollover for approx MAX_AGE later - scheduledRollover = threadPool - .scheduleWithFixedDelay(() -> rolloverAndDeleteHistoryIndex(), historyRolloverPeriod, executorName()); - } catch (Exception e) { - // This should be run on cluster startup - logger.error("Error rollover AD result indices. " + "Can't rollover AD result until clusterManager node is restarted.", e); - } - } - - @Override - public void offClusterManager() { - if (scheduledRollover != null) { - scheduledRollover.cancel(); + public void initDefaultResultIndexIfAbsent(ActionListener actionListener) { + if (!doesDefaultResultIndexExist()) { + initDefaultResultIndexDirectly(actionListener); } } - private String executorName() { - return ThreadPool.Names.MANAGEMENT; - } - - private void rescheduleRollover() { - if (clusterService.state().getNodes().isLocalNodeElectedClusterManager()) { - if (scheduledRollover != null) { - scheduledRollover.cancel(); + protected ActionListener markMappingUpToDate( + IndexType index, + ActionListener followingListener + ) { + return ActionListener.wrap(createdResponse -> { + if (createdResponse.isAcknowledged()) { + IndexState indexStatetate = indexStates.computeIfAbsent(index, k -> new IndexState(k.getMapping())); + if (Boolean.FALSE.equals(indexStatetate.mappingUpToDate)) { + indexStatetate.mappingUpToDate = Boolean.TRUE; + logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", index.getIndexName())); + } } - scheduledRollover = threadPool - .scheduleWithFixedDelay(() -> rolloverAndDeleteHistoryIndex(), historyRolloverPeriod, executorName()); - } + followingListener.onResponse(createdResponse); + }, exception -> followingListener.onFailure(exception)); } - void rolloverAndDeleteHistoryIndex() { - if (!doesDefaultAnomalyResultIndexExist()) { + protected void rolloverAndDeleteHistoryIndex( + String resultIndexAlias, + String allResultIndicesPattern, + String rolloverIndexPattern, + IndexType resultIndex + ) { + if (!doesDefaultResultIndexExist()) { return; } // We have to pass null for newIndexName in order to get Elastic to increment the index count. 
- RolloverRequest rollOverRequest = new RolloverRequest(CommonName.ANOMALY_RESULT_INDEX_ALIAS, null); - String adResultMapping = null; - try { - adResultMapping = getAnomalyResultMappings(); - } catch (IOException e) { - logger.error("Fail to roll over AD result index, as can't get AD result index mapping"); - return; - } + RolloverRequest rollOverRequest = new RolloverRequest(resultIndexAlias, null); + CreateIndexRequest createRequest = rollOverRequest.getCreateIndexRequest(); - createRequest.index(AD_RESULT_HISTORY_INDEX_PATTERN).mapping(adResultMapping, XContentType.JSON); + createRequest.index(rolloverIndexPattern).mapping(resultMapping, XContentType.JSON); - choosePrimaryShards(createRequest); + choosePrimaryShards(createRequest, true); rollOverRequest.addMaxIndexDocsCondition(historyMaxDocs * getNumberOfPrimaryShards()); adminClient.indices().rolloverIndex(rollOverRequest, ActionListener.wrap(response -> { if (!response.isRolledOver()) { - logger - .warn("{} not rolled over. Conditions were: {}", CommonName.ANOMALY_RESULT_INDEX_ALIAS, response.getConditionStatus()); + logger.warn("{} not rolled over. Conditions were: {}", resultIndexAlias, response.getConditionStatus()); } else { - IndexState indexStatetate = indexStates.computeIfAbsent(ADIndex.RESULT, IndexState::new); + IndexState indexStatetate = indexStates.computeIfAbsent(resultIndex, k -> new IndexState(k.getMapping())); indexStatetate.mappingUpToDate = true; - logger.info("{} rolled over. Conditions were: {}", CommonName.ANOMALY_RESULT_INDEX_ALIAS, response.getConditionStatus()); - deleteOldHistoryIndices(); + logger.info("{} rolled over. Conditions were: {}", resultIndexAlias, response.getConditionStatus()); + deleteOldHistoryIndices(allResultIndicesPattern, historyRetentionPeriod); } }, exception -> { logger.error("Fail to roll over result index", exception); })); } - void deleteOldHistoryIndices() { - Set candidates = new HashSet(); + protected void initResultIndexDirectly( + String resultIndexName, + String alias, + boolean hiddenIndex, + String resultIndexPattern, + IndexType resultIndex, + ActionListener actionListener + ) { + CreateIndexRequest request = new CreateIndexRequest(resultIndexName).mapping(resultMapping, XContentType.JSON); + if (alias != null) { + request.alias(new Alias(alias)); + } + choosePrimaryShards(request, hiddenIndex); + if (resultIndexPattern.equals(resultIndexName)) { + adminClient.indices().create(request, markMappingUpToDate(resultIndex, actionListener)); + } else { + adminClient.indices().create(request, actionListener); + } + } - ClusterStateRequest clusterStateRequest = new ClusterStateRequest() - .clear() - .indices(AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN) - .metadata(true) - .local(true) - .indicesOptions(IndicesOptions.strictExpand()); + public abstract boolean doesCheckpointIndexExist(); - adminClient.cluster().state(clusterStateRequest, ActionListener.wrap(clusterStateResponse -> { - String latestToDelete = null; - long latest = Long.MIN_VALUE; - for (IndexMetadata indexMetaData : clusterStateResponse.getState().metadata().indices().values()) { - long creationTime = indexMetaData.getCreationDate(); + public abstract void initCheckpointIndex(ActionListener actionListener); - if ((Instant.now().toEpochMilli() - creationTime) > historyRetentionPeriod.millis()) { - String indexName = indexMetaData.getIndex().getName(); - candidates.add(indexName); - if (latest < creationTime) { - latest = creationTime; - latestToDelete = indexName; - } - } - } + public abstract boolean 
doesDefaultResultIndexExist(); - if (candidates.size() > 1) { - // delete all indices except the last one because the last one may contain docs newer than the retention period - candidates.remove(latestToDelete); - String[] toDelete = candidates.toArray(Strings.EMPTY_ARRAY); - DeleteIndexRequest deleteIndexRequest = new DeleteIndexRequest(toDelete); - adminClient.indices().delete(deleteIndexRequest, ActionListener.wrap(deleteIndexResponse -> { - if (!deleteIndexResponse.isAcknowledged()) { - logger - .error( - "Could not delete one or more Anomaly result indices: {}. Retrying one by one.", - Arrays.toString(toDelete) - ); - deleteIndexIteration(toDelete); - } else { - logger.info("Succeeded in deleting expired anomaly result indices: {}.", Arrays.toString(toDelete)); - } - }, exception -> { - logger.error("Failed to delete expired anomaly result indices: {}.", Arrays.toString(toDelete)); - deleteIndexIteration(toDelete); - })); - } - }, exception -> { logger.error("Fail to delete result indices", exception); })); - } + public abstract boolean doesStateIndexExist(); - private void deleteIndexIteration(String[] toDelete) { - for (String index : toDelete) { - DeleteIndexRequest singleDeleteRequest = new DeleteIndexRequest(index); - adminClient.indices().delete(singleDeleteRequest, ActionListener.wrap(singleDeleteResponse -> { - if (!singleDeleteResponse.isAcknowledged()) { - logger.error("Retrying deleting {} does not succeed.", index); - } - }, exception -> { - if (exception instanceof IndexNotFoundException) { - logger.info("{} was already deleted.", index); - } else { - logger.error(new ParameterizedMessage("Retrying deleting {} does not succeed.", index), exception); - } - })); - } - } - - public void update() { - if ((allMappingUpdated && allSettingUpdated) || updateRunningTimes >= maxUpdateRunningTimes || updateRunning.get()) { - return; - } - updateRunning.set(true); - updateRunningTimes++; - - // set updateRunning to false when both updateMappingIfNecessary and updateSettingIfNecessary - // stop running - final GroupedActionListener groupListeneer = new GroupedActionListener<>( - ActionListener.wrap(r -> updateRunning.set(false), exception -> { - updateRunning.set(false); - logger.error("Fail to update AD indices", exception); - }), - // 2 since we need both updateMappingIfNecessary and updateSettingIfNecessary to return - // before setting updateRunning to false - 2 - ); - - updateMappingIfNecessary(groupListeneer); - updateSettingIfNecessary(groupListeneer); - } - - private void updateSettingIfNecessary(GroupedActionListener delegateListeneer) { - if (allSettingUpdated) { - delegateListeneer.onResponse(null); - return; - } - - List updates = new ArrayList<>(); - for (ADIndex index : ADIndex.values()) { - Boolean updated = indexStates.computeIfAbsent(index, IndexState::new).settingUpToDate; - if (Boolean.FALSE.equals(updated)) { - updates.add(index); - } - } - if (updates.size() == 0) { - allSettingUpdated = true; - delegateListeneer.onResponse(null); - return; - } - - final GroupedActionListener conglomerateListeneer = new GroupedActionListener<>( - ActionListener.wrap(r -> delegateListeneer.onResponse(null), exception -> { - delegateListeneer.onResponse(null); - logger.error("Fail to update AD indices' mappings", exception); - }), - updates.size() - ); - for (ADIndex adIndex : updates) { - logger.info(new ParameterizedMessage("Check [{}]'s setting", adIndex.getIndexName())); - switch (adIndex) { - case JOB: - updateJobIndexSettingIfNecessary(indexStates.computeIfAbsent(adIndex, 
IndexState::new), conglomerateListeneer); - break; - default: - // we don't have settings to update for other indices - IndexState indexState = indexStates.computeIfAbsent(adIndex, IndexState::new); - indexState.settingUpToDate = true; - logger.info(new ParameterizedMessage("Mark [{}]'s setting up-to-date", adIndex.getIndexName())); - conglomerateListeneer.onResponse(null); - break; - } - - } - } - - /** - * Update mapping if schema version changes. - */ - private void updateMappingIfNecessary(GroupedActionListener delegateListeneer) { - if (allMappingUpdated) { - delegateListeneer.onResponse(null); - return; - } - - List updates = new ArrayList<>(); - for (ADIndex index : ADIndex.values()) { - Boolean updated = indexStates.computeIfAbsent(index, IndexState::new).mappingUpToDate; - if (Boolean.FALSE.equals(updated)) { - updates.add(index); - } - } - if (updates.size() == 0) { - allMappingUpdated = true; - delegateListeneer.onResponse(null); - return; - } + public abstract void initDefaultResultIndexDirectly(ActionListener actionListener); - final GroupedActionListener conglomerateListeneer = new GroupedActionListener<>( - ActionListener.wrap(r -> delegateListeneer.onResponse(null), exception -> { - delegateListeneer.onResponse(null); - logger.error("Fail to update AD indices' mappings", exception); - }), - updates.size() - ); - - for (ADIndex adIndex : updates) { - logger.info(new ParameterizedMessage("Check [{}]'s mapping", adIndex.getIndexName())); - shouldUpdateIndex(adIndex, ActionListener.wrap(shouldUpdate -> { - if (shouldUpdate) { - adminClient - .indices() - .putMapping( - new PutMappingRequest().indices(adIndex.getIndexName()).source(adIndex.getMapping(), XContentType.JSON), - ActionListener.wrap(putMappingResponse -> { - if (putMappingResponse.isAcknowledged()) { - logger.info(new ParameterizedMessage("Succeeded in updating [{}]'s mapping", adIndex.getIndexName())); - markMappingUpdated(adIndex); - } else { - logger.error(new ParameterizedMessage("Fail to update [{}]'s mapping", adIndex.getIndexName())); - } - conglomerateListeneer.onResponse(null); - }, exception -> { - logger - .error( - new ParameterizedMessage( - "Fail to update [{}]'s mapping due to [{}]", - adIndex.getIndexName(), - exception.getMessage() - ) - ); - conglomerateListeneer.onFailure(exception); - }) - ); - } else { - // index does not exist or the version is already up-to-date. - // When creating index, new mappings will be used. - // We don't need to update it. 
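The update decision being removed here hinges on a schema version stored in each index mapping's _meta section; the generalized base class keeps the same check. A minimal, self-contained restatement of that comparison (the method name and zero default are illustrative; the plugin uses CommonValue.NO_SCHEMA_VERSION):

    import java.util.Map;

    // An index mapping carries {"_meta": {"schema_version": N}}; a putMapping
    // is only issued when the version bundled in the plugin's mapping file is
    // strictly newer than the one stored on the live index.
    @SuppressWarnings("unchecked")
    static boolean shouldUpdateMapping(Map<String, Object> indexMapping, int bundledVersion) {
        int liveVersion = 0; // stands in for CommonValue.NO_SCHEMA_VERSION
        Object meta = indexMapping.get("_meta");
        if (meta instanceof Map) {
            Object schemaVersion = ((Map<String, Object>) meta).get("schema_version");
            if (schemaVersion instanceof Integer) {
                liveVersion = (Integer) schemaVersion;
            }
        }
        return bundledVersion > liveVersion;
    }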
- logger.info(new ParameterizedMessage("We don't need to update [{}]'s mapping", adIndex.getIndexName())); - markMappingUpdated(adIndex); - conglomerateListeneer.onResponse(null); - } - }, exception -> { - logger - .error( - new ParameterizedMessage("Fail to check whether we should update [{}]'s mapping", adIndex.getIndexName()), - exception - ); - conglomerateListeneer.onFailure(exception); - })); + protected abstract IndexRequest createDummyIndexRequest(String resultIndex) throws IOException; - } - } + protected abstract DeleteRequest createDummyDeleteRequest(String resultIndex) throws IOException; - private void markMappingUpdated(ADIndex adIndex) { - IndexState indexState = indexStates.computeIfAbsent(adIndex, IndexState::new); - if (Boolean.FALSE.equals(indexState.mappingUpToDate)) { - indexState.mappingUpToDate = Boolean.TRUE; - logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", adIndex.getIndexName())); - } - } + protected abstract void rolloverAndDeleteHistoryIndex(); - private void shouldUpdateIndex(ADIndex index, ActionListener thenDo) { - boolean exists = false; - if (index.isAlias()) { - exists = AnomalyDetectionIndices.doesAliasExists(clusterService, index.getIndexName()); - } else { - exists = AnomalyDetectionIndices.doesIndexExists(clusterService, index.getIndexName()); - } - if (false == exists) { - thenDo.onResponse(Boolean.FALSE); - return; - } + public abstract void initCustomResultIndexDirectly(String resultIndex, ActionListener actionListener); - Integer newVersion = indexStates.computeIfAbsent(index, IndexState::new).schemaVersion; - if (index.isAlias()) { - GetAliasesRequest getAliasRequest = new GetAliasesRequest() - .aliases(index.getIndexName()) - .indicesOptions(IndicesOptions.lenientExpandOpenHidden()); - adminClient.indices().getAliases(getAliasRequest, ActionListener.wrap(getAliasResponse -> { - String concreteIndex = null; - for (Map.Entry> entry : getAliasResponse.getAliases().entrySet()) { - if (false == entry.getValue().isEmpty()) { - // we assume the alias map to one concrete index, thus we can return after finding one - concreteIndex = entry.getKey(); - break; - } - } - if (concreteIndex == null) { - thenDo.onResponse(Boolean.FALSE); - return; - } - shouldUpdateConcreteIndex(concreteIndex, newVersion, thenDo); - }, exception -> logger.error(new ParameterizedMessage("Fail to get [{}]'s alias", index.getIndexName()), exception))); - } else { - shouldUpdateConcreteIndex(index.getIndexName(), newVersion, thenDo); - } - } - - @SuppressWarnings("unchecked") - private void shouldUpdateConcreteIndex(String concreteIndex, Integer newVersion, ActionListener thenDo) { - IndexMetadata indexMeataData = clusterService.state().getMetadata().indices().get(concreteIndex); - if (indexMeataData == null) { - thenDo.onResponse(Boolean.FALSE); - return; - } - Integer oldVersion = CommonValue.NO_SCHEMA_VERSION; - - Map indexMapping = indexMeataData.mapping().getSourceAsMap(); - Object meta = indexMapping.get(META); - if (meta != null && meta instanceof Map) { - Map metaMapping = (Map) meta; - Object schemaVersion = metaMapping.get(CommonName.SCHEMA_VERSION_FIELD); - if (schemaVersion instanceof Integer) { - oldVersion = (Integer) schemaVersion; - } - } - thenDo.onResponse(newVersion > oldVersion); - } - - private static Integer parseSchemaVersion(String mapping) { - try { - XContentParser xcp = XContentType.JSON - .xContent() - .createParser(NamedXContentRegistry.EMPTY, LoggingDeprecationHandler.INSTANCE, mapping); - - while (!xcp.isClosed()) { - Token 
token = xcp.currentToken(); - if (token != null && token != XContentParser.Token.END_OBJECT && token != XContentParser.Token.START_OBJECT) { - if (xcp.currentName() != META) { - xcp.nextToken(); - xcp.skipChildren(); - } else { - while (xcp.nextToken() != XContentParser.Token.END_OBJECT) { - if (xcp.currentName().equals(SCHEMA_VERSION)) { - - Integer version = xcp.intValue(); - if (version < 0) { - version = CommonValue.NO_SCHEMA_VERSION; - } - return version; - } else { - xcp.nextToken(); - } - } - - } - } - xcp.nextToken(); - } - return CommonValue.NO_SCHEMA_VERSION; - } catch (Exception e) { - // since this method is called in the constructor that is called by AnomalyDetectorPlugin.createComponents, - // we cannot throw checked exception - throw new RuntimeException(e); - } - } - - /** - * - * @param index Index metadata - * @return The schema version of the given Index - */ - public int getSchemaVersion(ADIndex index) { - IndexState indexState = this.indexStates.computeIfAbsent(index, IndexState::new); - return indexState.schemaVersion; - } - - private void updateJobIndexSettingIfNecessary(IndexState jobIndexState, ActionListener listener) { - GetSettingsRequest getSettingsRequest = new GetSettingsRequest() - .indices(ADIndex.JOB.getIndexName()) - .names( - new String[] { - IndexMetadata.SETTING_NUMBER_OF_SHARDS, - IndexMetadata.SETTING_NUMBER_OF_REPLICAS, - IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS } - ); - client.execute(GetSettingsAction.INSTANCE, getSettingsRequest, ActionListener.wrap(settingResponse -> { - // auto expand setting is a range string like "1-all" - String autoExpandReplica = getStringSetting(settingResponse, IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS); - // if the auto expand setting is already there, return immediately - if (autoExpandReplica != null) { - jobIndexState.settingUpToDate = true; - logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", ADIndex.JOB.getIndexName())); - listener.onResponse(null); - return; - } - Integer primaryShardsNumber = getIntegerSetting(settingResponse, IndexMetadata.SETTING_NUMBER_OF_SHARDS); - Integer replicaNumber = getIntegerSetting(settingResponse, IndexMetadata.SETTING_NUMBER_OF_REPLICAS); - if (primaryShardsNumber == null || replicaNumber == null) { - logger - .error( - new ParameterizedMessage( - "Fail to find AD job index's primary or replica shard number: primary [{}], replica [{}]", - primaryShardsNumber, - replicaNumber - ) - ); - // don't throw exception as we don't know how to handle it and retry next time - listener.onResponse(null); - return; - } - // at least minJobIndexReplicas - // at most maxJobIndexReplicas / primaryShardsNumber replicas. - // For example, if we have 2 primary shards, since the max number of shards are maxJobIndexReplicas (20), - // we will use 20 / 2 = 10 replicas as the upper bound of replica. 
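The arithmetic in that comment, spelled out as a standalone fragment (the minimum of 1 is an assumed default; the 20 comes from the comment above):

    // Upper-bound the job index's replicas by spreading maxJobIndexReplicas
    // across the primaries, but never go below minJobIndexReplicas.
    int minJobIndexReplicas = 1;   // assumed default
    int maxJobIndexReplicas = 20;  // per the comment above
    int primaryShardsNumber = 2;   // example value
    int maxExpectedReplicas = Math.max(maxJobIndexReplicas / primaryShardsNumber, minJobIndexReplicas); // 10
    String autoExpandReplicas = minJobIndexReplicas + "-" + maxExpectedReplicas; // "1-10"
    // index.auto_expand_replicas is then set to "1-10", letting OpenSearch
    // scale the replica count with the number of data nodes in that range.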
- int maxExpectedReplicas = Math.max(maxJobIndexReplicas / primaryShardsNumber, minJobIndexReplicas); - Settings updatedSettings = Settings - .builder() - .put(IndexMetadata.SETTING_AUTO_EXPAND_REPLICAS, minJobIndexReplicas + "-" + maxExpectedReplicas) - .build(); - final UpdateSettingsRequest updateSettingsRequest = new UpdateSettingsRequest(ADIndex.JOB.getIndexName()) - .settings(updatedSettings); - client.admin().indices().updateSettings(updateSettingsRequest, ActionListener.wrap(response -> { - jobIndexState.settingUpToDate = true; - logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", ADIndex.JOB.getIndexName())); - listener.onResponse(null); - }, listener::onFailure)); - }, e -> { - if (e instanceof IndexNotFoundException) { - // new index will be created with auto expand replica setting - jobIndexState.settingUpToDate = true; - logger.info(new ParameterizedMessage("Mark [{}]'s mapping up-to-date", ADIndex.JOB.getIndexName())); - listener.onResponse(null); - } else { - listener.onFailure(e); - } - })); - } - - private static Integer getIntegerSetting(GetSettingsResponse settingsResponse, String settingKey) { - Integer value = null; - for (Settings settings : settingsResponse.getIndexToSettings().values()) { - value = settings.getAsInt(settingKey, null); - if (value != null) { - break; - } - } - return value; - } - - private static String getStringSetting(GetSettingsResponse settingsResponse, String settingKey) { - String value = null; - for (Settings settings : settingsResponse.getIndexToSettings().values()) { - value = settings.get(settingKey, null); - if (value != null) { - break; - } - } - return value; - } + public abstract void initStateIndex(ActionListener actionListener); } diff --git a/src/main/java/org/opensearch/timeseries/indices/TimeSeriesIndex.java b/src/main/java/org/opensearch/timeseries/indices/TimeSeriesIndex.java new file mode 100644 index 000000000..e7364ed32 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/indices/TimeSeriesIndex.java @@ -0,0 +1,22 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.indices; + +public interface TimeSeriesIndex { + public String getIndexName(); + + public boolean isAlias(); + + public String getMapping(); + + public boolean isJobIndex(); +} diff --git a/src/main/java/org/opensearch/timeseries/ml/IntermediateResult.java b/src/main/java/org/opensearch/timeseries/ml/IntermediateResult.java new file mode 100644 index 000000000..9a8704842 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/ml/IntermediateResult.java @@ -0,0 +1,86 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.timeseries.ml; + +import java.time.Instant; +import java.util.List; +import java.util.Objects; +import java.util.Optional; + +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IndexableResult; + +public abstract class IntermediateResult { + protected final long totalUpdates; + protected final double rcfScore; + + public IntermediateResult(long totalUpdates, double rcfScore) { + this.totalUpdates = totalUpdates; + this.rcfScore = rcfScore; + } + + public long getTotalUpdates() { + return totalUpdates; + } + + public double getRcfScore() { + return rcfScore; + } + + @Override + public int hashCode() { + return Objects.hash(totalUpdates); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) + return true; + if (obj == null) + return false; + if (getClass() != obj.getClass()) + return false; + IntermediateResult other = (IntermediateResult) obj; + return totalUpdates == other.totalUpdates && Double.doubleToLongBits(rcfScore) == Double.doubleToLongBits(other.rcfScore); + } + + /** + * convert intermediateResult into 1+ indexable results. + * @param config Config accessor + * @param dataStartInstant data start time + * @param dataEndInstant data end time + * @param executionStartInstant execution start time + * @param executionEndInstant execution end time + * @param featureData feature data + * @param entity entity info + * @param schemaVersion schema version + * @param modelId Model id + * @param taskId Task id + * @param error Error + * @return 1+ indexable results + */ + public abstract List toIndexableResults( + Config config, + Instant dataStartInstant, + Instant dataEndInstant, + Instant executionStartInstant, + Instant executionEndInstant, + List featureData, + Optional entity, + Integer schemaVersion, + String modelId, + String taskId, + String error + ); +} diff --git a/src/main/java/org/opensearch/ad/ml/SingleStreamModelIdMapper.java b/src/main/java/org/opensearch/timeseries/ml/SingleStreamModelIdMapper.java similarity index 98% rename from src/main/java/org/opensearch/ad/ml/SingleStreamModelIdMapper.java rename to src/main/java/org/opensearch/timeseries/ml/SingleStreamModelIdMapper.java index ac3ce899d..c33c4818f 100644 --- a/src/main/java/org/opensearch/ad/ml/SingleStreamModelIdMapper.java +++ b/src/main/java/org/opensearch/timeseries/ml/SingleStreamModelIdMapper.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.ml; +package org.opensearch.timeseries.ml; import java.util.Locale; import java.util.regex.Matcher; diff --git a/src/main/java/org/opensearch/timeseries/model/Config.java b/src/main/java/org/opensearch/timeseries/model/Config.java new file mode 100644 index 000000000..15f67d116 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/model/Config.java @@ -0,0 +1,575 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.model; + +import static org.opensearch.timeseries.constant.CommonMessages.INVALID_CHAR_IN_RESULT_INDEX_NAME; + +import java.io.IOException; +import java.time.Duration; +import java.time.Instant; +import java.util.List; +import java.util.Map; +import java.util.stream.Collectors; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.apache.logging.log4j.util.Strings; +import org.opensearch.ad.model.AnomalyDetector; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.common.io.stream.Writeable; +import org.opensearch.core.xcontent.ToXContentObject; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.forecast.model.Forecaster; +import org.opensearch.index.query.QueryBuilder; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.FixedValueImputer; +import org.opensearch.timeseries.dataprocessor.ImputationMethod; +import org.opensearch.timeseries.dataprocessor.ImputationOption; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.dataprocessor.PreviousValueImputer; +import org.opensearch.timeseries.dataprocessor.ZeroImputer; +import org.opensearch.timeseries.settings.TimeSeriesSettings; + +import com.google.common.base.Objects; +import com.google.common.collect.ImmutableList; + +public abstract class Config implements Writeable, ToXContentObject { + private static final Logger logger = LogManager.getLogger(Config.class); + + public static final int MAX_RESULT_INDEX_NAME_SIZE = 255; + // OS doesn’t allow uppercase: https://tinyurl.com/yse2xdbx + public static final String RESULT_INDEX_NAME_PATTERN = "[a-z0-9_-]+"; + + public static final String NO_ID = ""; + public static final String TIMEOUT = "timeout"; + public static final String GENERAL_SETTINGS = "general_settings"; + public static final String AGGREGATION = "aggregation_issue"; + + // field in JSON representation + public static final String NAME_FIELD = "name"; + public static final String DESCRIPTION_FIELD = "description"; + public static final String TIMEFIELD_FIELD = "time_field"; + public static final String INDICES_FIELD = "indices"; + public static final String UI_METADATA_FIELD = "ui_metadata"; + public static final String FILTER_QUERY_FIELD = "filter_query"; + public static final String FEATURE_ATTRIBUTES_FIELD = "feature_attributes"; + public static final String WINDOW_DELAY_FIELD = "window_delay"; + public static final String SHINGLE_SIZE_FIELD 
= "shingle_size"; + public static final String LAST_UPDATE_TIME_FIELD = "last_update_time"; + public static final String CATEGORY_FIELD = "category_field"; + public static final String USER_FIELD = "user"; + public static final String RESULT_INDEX_FIELD = "result_index"; + public static final String IMPUTATION_OPTION_FIELD = "imputation_option"; + + private static final Imputer zeroImputer; + private static final Imputer previousImputer; + private static final Imputer linearImputer; + private static final Imputer linearImputerIntegerSensitive; + + protected String id; + protected Long version; + protected String name; + protected String description; + protected String timeField; + protected List indices; + protected List featureAttributes; + protected QueryBuilder filterQuery; + protected TimeConfiguration interval; + protected TimeConfiguration windowDelay; + protected Integer shingleSize; + protected String customResultIndex; + protected Map uiMetadata; + protected Integer schemaVersion; + protected Instant lastUpdateTime; + protected List categoryFields; + protected User user; + protected ImputationOption imputationOption; + + // validation error + protected String errorMessage; + protected ValidationIssueType issueType; + + protected Imputer imputer; + + public static String INVALID_RESULT_INDEX_NAME_SIZE = "Result index name size must contains less than " + + MAX_RESULT_INDEX_NAME_SIZE + + " characters"; + + static { + zeroImputer = new ZeroImputer(); + previousImputer = new PreviousValueImputer(); + linearImputer = new LinearUniformImputer(false); + linearImputerIntegerSensitive = new LinearUniformImputer(true); + } + + protected Config( + String id, + Long version, + String name, + String description, + String timeField, + List indices, + List features, + QueryBuilder filterQuery, + TimeConfiguration windowDelay, + Integer shingleSize, + Map uiMetadata, + Integer schemaVersion, + Instant lastUpdateTime, + List categoryFields, + User user, + String resultIndex, + TimeConfiguration interval, + ImputationOption imputationOption + ) { + if (Strings.isBlank(name)) { + errorMessage = CommonMessages.EMPTY_NAME; + issueType = ValidationIssueType.NAME; + return; + } + if (Strings.isBlank(timeField)) { + errorMessage = CommonMessages.NULL_TIME_FIELD; + issueType = ValidationIssueType.TIMEFIELD_FIELD; + return; + } + if (indices == null || indices.isEmpty()) { + errorMessage = CommonMessages.EMPTY_INDICES; + issueType = ValidationIssueType.INDICES; + return; + } + + if (invalidShingleSizeRange(shingleSize)) { + errorMessage = "Shingle size must be a positive integer no larger than " + + TimeSeriesSettings.MAX_SHINGLE_SIZE + + ". Got " + + shingleSize; + issueType = ValidationIssueType.SHINGLE_SIZE_FIELD; + return; + } + + errorMessage = validateCustomResultIndex(resultIndex); + if (errorMessage != null) { + issueType = ValidationIssueType.RESULT_INDEX; + return; + } + + if (imputationOption != null + && imputationOption.getMethod() == ImputationMethod.FIXED_VALUES + && imputationOption.getDefaultFill().isEmpty()) { + issueType = ValidationIssueType.IMPUTATION; + errorMessage = "No given values for fixed value interpolation"; + return; + } + + this.id = id; + this.version = version; + this.name = name; + this.description = description; + this.timeField = timeField; + this.indices = indices; + this.featureAttributes = features == null ? 
ImmutableList.of() : ImmutableList.copyOf(features); + this.filterQuery = filterQuery; + this.interval = interval; + this.windowDelay = windowDelay; + this.shingleSize = getShingleSize(shingleSize); + this.uiMetadata = uiMetadata; + this.schemaVersion = schemaVersion; + this.lastUpdateTime = lastUpdateTime; + this.categoryFields = categoryFields; + this.user = user; + this.customResultIndex = Strings.trimToNull(resultIndex); + this.imputationOption = imputationOption; + this.imputer = createImputer(); + this.issueType = null; + this.errorMessage = null; + } + + public Config(StreamInput input) throws IOException { + id = input.readOptionalString(); + version = input.readOptionalLong(); + name = input.readString(); + description = input.readOptionalString(); + timeField = input.readString(); + indices = input.readStringList(); + featureAttributes = input.readList(Feature::new); + filterQuery = input.readNamedWriteable(QueryBuilder.class); + interval = IntervalTimeConfiguration.readFrom(input); + windowDelay = IntervalTimeConfiguration.readFrom(input); + shingleSize = input.readInt(); + schemaVersion = input.readInt(); + this.categoryFields = input.readOptionalStringList(); + lastUpdateTime = input.readInstant(); + if (input.readBoolean()) { + this.user = new User(input); + } else { + user = null; + } + if (input.readBoolean()) { + this.uiMetadata = input.readMap(); + } else { + this.uiMetadata = null; + } + customResultIndex = input.readOptionalString(); + if (input.readBoolean()) { + this.imputationOption = new ImputationOption(input); + } else { + this.imputationOption = null; + } + this.imputer = createImputer(); + } + + /* + * Implicit constructor that is called implicitly when a subtype + * needs to invoke a constructor like AnomalyDetector(StreamInput). Otherwise, + * we will get the compiler error: + * "Implicit super constructor Config() is undefined. + * Must explicitly invoke another constructor". + */ + public Config() { + this.imputer = null; + } + + @Override + public void writeTo(StreamOutput output) throws IOException { + output.writeOptionalString(id); + output.writeOptionalLong(version); + output.writeString(name); + output.writeOptionalString(description); + output.writeString(timeField); + output.writeStringCollection(indices); + output.writeList(featureAttributes); + output.writeNamedWriteable(filterQuery); + interval.writeTo(output); + windowDelay.writeTo(output); + output.writeInt(shingleSize); + output.writeInt(schemaVersion); + output.writeOptionalStringCollection(categoryFields); + output.writeInstant(lastUpdateTime); + if (user != null) { + output.writeBoolean(true); // user exists + user.writeTo(output); + } else { + output.writeBoolean(false); // user does not exist + } + if (uiMetadata != null) { + output.writeBoolean(true); + output.writeMap(uiMetadata); + } else { + output.writeBoolean(false); + } + output.writeOptionalString(customResultIndex); + if (imputationOption != null) { + output.writeBoolean(true); + imputationOption.writeTo(output); + } else { + output.writeBoolean(false); + } + } + + /** + * If the given shingle size is null, return default; + * otherwise, return the given shingle size. + * + * @param customShingleSize Given shingle size + * @return Shingle size + */ + protected static Integer getShingleSize(Integer customShingleSize) { + return customShingleSize == null ?
TimeSeriesSettings.DEFAULT_SHINGLE_SIZE : customShingleSize; + } + + public boolean invalidShingleSizeRange(Integer shingleSizeToTest) { + return shingleSizeToTest != null && (shingleSizeToTest < 1 || shingleSizeToTest > TimeSeriesSettings.MAX_SHINGLE_SIZE); + } + + /** + * + * @return either ValidationAspect.FORECASTER or ValidationAspect.DETECTOR + * depending on whether this is a forecaster or detector config. + */ + protected abstract ValidationAspect getConfigValidationAspect(); + + @Generated + @Override + public boolean equals(Object o) { + if (this == o) + return true; + if (o == null || getClass() != o.getClass()) + return false; + Config config = (Config) o; + // a few fields not included: + // 1) didn't include uiMetadata since toXContent/parse will produce a map of maps + // and cause the parsed one to not equal the original one. This can be confusing. + // 2) didn't include id, schemaVersion, and lastUpdateTime as we deemed equality based on contents. + // Including id fails tests like AnomalyDetectorExecutionInput.testParseAnomalyDetectorExecutionInput. + return Objects.equal(name, config.name) + && Objects.equal(description, config.description) + && Objects.equal(timeField, config.timeField) + && Objects.equal(indices, config.indices) + && Objects.equal(featureAttributes, config.featureAttributes) + && Objects.equal(filterQuery, config.filterQuery) + && Objects.equal(interval, config.interval) + && Objects.equal(windowDelay, config.windowDelay) + && Objects.equal(shingleSize, config.shingleSize) + && Objects.equal(categoryFields, config.categoryFields) + && Objects.equal(user, config.user) + && Objects.equal(customResultIndex, config.customResultIndex) + && Objects.equal(imputationOption, config.imputationOption); + } + + @Generated + @Override + public int hashCode() { + return Objects + .hashCode( + name, + description, + timeField, + indices, + featureAttributes, + filterQuery, + interval, + windowDelay, + shingleSize, + categoryFields, + schemaVersion, + user, + customResultIndex, + imputationOption + ); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder + .field(NAME_FIELD, name) + .field(DESCRIPTION_FIELD, description) + .field(TIMEFIELD_FIELD, timeField) + .field(INDICES_FIELD, indices.toArray()) + .field(FILTER_QUERY_FIELD, filterQuery) + .field(WINDOW_DELAY_FIELD, windowDelay) + .field(SHINGLE_SIZE_FIELD, shingleSize) + .field(CommonName.SCHEMA_VERSION_FIELD, schemaVersion) + .field(FEATURE_ATTRIBUTES_FIELD, featureAttributes.toArray()); + + if (uiMetadata != null && !uiMetadata.isEmpty()) { + builder.field(UI_METADATA_FIELD, uiMetadata); + } + if (lastUpdateTime != null) { + builder.field(LAST_UPDATE_TIME_FIELD, lastUpdateTime.toEpochMilli()); + } + if (categoryFields != null) { + builder.field(CATEGORY_FIELD, categoryFields.toArray()); + } + if (user != null) { + builder.field(USER_FIELD, user); + } + if (customResultIndex != null) { + builder.field(RESULT_INDEX_FIELD, customResultIndex); + } + if (imputationOption != null) { + builder.field(IMPUTATION_OPTION_FIELD, imputationOption); + } + return builder; + } + + public Long getVersion() { + return version; + } + + public String getName() { + return name; + } + + public String getDescription() { + return description; + } + + public String getTimeField() { + return timeField; + } + + public List getIndices() { + return indices; + } + + public List getFeatureAttributes() { + return featureAttributes; + } + + public QueryBuilder getFilterQuery() {
return filterQuery; + } + + /** + * Returns enabled feature ids in the same order in feature attributes. + * + * @return a list of filtered feature ids. + */ + public List getEnabledFeatureIds() { + return featureAttributes.stream().filter(Feature::getEnabled).map(Feature::getId).collect(Collectors.toList()); + } + + public List getEnabledFeatureNames() { + return featureAttributes.stream().filter(Feature::getEnabled).map(Feature::getName).collect(Collectors.toList()); + } + + public TimeConfiguration getInterval() { + return interval; + } + + public TimeConfiguration getWindowDelay() { + return windowDelay; + } + + public Integer getShingleSize() { + return shingleSize; + } + + public Map getUiMetadata() { + return uiMetadata; + } + + public Integer getSchemaVersion() { + return schemaVersion; + } + + public Instant getLastUpdateTime() { + return lastUpdateTime; + } + + public List getCategoryFields() { + return this.categoryFields; + } + + public String getId() { + return id; + } + + public long getIntervalInMilliseconds() { + return ((IntervalTimeConfiguration) getInterval()).toDuration().toMillis(); + } + + public long getIntervalInSeconds() { + return getIntervalInMilliseconds() / 1000; + } + + public long getIntervalInMinutes() { + return getIntervalInMilliseconds() / 1000 / 60; + } + + public Duration getIntervalDuration() { + return ((IntervalTimeConfiguration) getInterval()).toDuration(); + } + + public User getUser() { + return user; + } + + public void setUser(User user) { + this.user = user; + } + + public String getCustomResultIndex() { + return customResultIndex; + } + + public boolean isHighCardinality() { + return Config.isHC(getCategoryFields()); + } + + public boolean hasMultipleCategories() { + return categoryFields != null && categoryFields.size() > 1; + } + + public String validateCustomResultIndex(String resultIndex) { + if (resultIndex == null) { + return null; + } + if (resultIndex.length() > MAX_RESULT_INDEX_NAME_SIZE) { + return Config.INVALID_RESULT_INDEX_NAME_SIZE; + } + if (!resultIndex.matches(RESULT_INDEX_NAME_PATTERN)) { + return INVALID_CHAR_IN_RESULT_INDEX_NAME; + } + return null; + } + + public static boolean isHC(List categoryFields) { + return categoryFields != null && categoryFields.size() > 0; + } + + public ImputationOption getImputationOption() { + return imputationOption; + } + + public Imputer getImputer() { + if (imputer != null) { + return imputer; + } + imputer = createImputer(); + return imputer; + } + + protected Imputer createImputer() { + Imputer imputer = null; + + // default interpolator is using last known value + if (imputationOption == null) { + return previousImputer; + } + + switch (imputationOption.getMethod()) { + case ZERO: + imputer = zeroImputer; + break; + case FIXED_VALUES: + // we did validate default fill is not empty in the constructor + imputer = new FixedValueImputer(imputationOption.getDefaultFill().get()); + break; + case PREVIOUS: + imputer = previousImputer; + break; + case LINEAR: + if (imputationOption.isIntegerSentive()) { + imputer = linearImputerIntegerSensitive; + } else { + imputer = linearImputer; + } + break; + default: + logger.error("unsupported method: " + imputationOption.getMethod()); + imputer = new PreviousValueImputer(); + break; + } + return imputer; + } + + protected void checkAndThrowValidationErrors(ValidationAspect validationAspect) { + if (errorMessage != null && issueType != null) { + throw new ValidationException(errorMessage, issueType, validationAspect); + } else if (errorMessage != null || 
issueType != null) { + throw new TimeSeriesException(CommonMessages.FAIL_TO_VALIDATE); + } + } + + public static Config parseConfig(Class configClass, XContentParser parser) throws IOException { + if (configClass == AnomalyDetector.class) { + return AnomalyDetector.parse(parser); + } else if (configClass == Forecaster.class) { + return Forecaster.parse(parser); + } else { + throw new IllegalArgumentException("Unsupported config type. Supported config types are [AnomalyDetector, Forecaster]"); + } + } +} diff --git a/src/main/java/org/opensearch/ad/model/DataByFeatureId.java b/src/main/java/org/opensearch/timeseries/model/DataByFeatureId.java similarity index 90% rename from src/main/java/org/opensearch/ad/model/DataByFeatureId.java rename to src/main/java/org/opensearch/timeseries/model/DataByFeatureId.java index f3686ee53..c74679214 100644 --- a/src/main/java/org/opensearch/ad/model/DataByFeatureId.java +++ b/src/main/java/org/opensearch/timeseries/model/DataByFeatureId.java @@ -1,15 +1,9 @@ /* + * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; @@ -18,7 +12,6 @@ import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; -import org.opensearch.core.xcontent.ToXContent.Params; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; diff --git a/src/main/java/org/opensearch/ad/model/DetectionDateRange.java b/src/main/java/org/opensearch/timeseries/model/DateRange.java similarity index 87% rename from src/main/java/org/opensearch/ad/model/DetectionDateRange.java rename to src/main/java/org/opensearch/timeseries/model/DateRange.java index cd1f6b24b..f6b99b8e5 100644 --- a/src/main/java/org/opensearch/ad/model/DetectionDateRange.java +++ b/src/main/java/org/opensearch/timeseries/model/DateRange.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; @@ -17,18 +17,18 @@ import java.time.Instant; import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.common.io.stream.Writeable; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; -public class DetectionDateRange implements ToXContentObject, Writeable { +public class DateRange implements ToXContentObject, Writeable { public static final String START_TIME_FIELD = "start_time"; public static final String END_TIME_FIELD = "end_time"; @@ -36,13 +36,13 @@ public class DetectionDateRange implements ToXContentObject, Writeable { private final Instant startTime; private final Instant endTime; - public DetectionDateRange(Instant startTime, Instant endTime) { + public DateRange(Instant startTime, Instant endTime) { this.startTime = startTime; this.endTime = endTime; validate(); } - public DetectionDateRange(StreamInput in) throws IOException { + public DateRange(StreamInput in) throws IOException { this.startTime = in.readInstant(); this.endTime = in.readInstant(); validate(); @@ -68,7 +68,7 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws return xContentBuilder.endObject(); } - public static DetectionDateRange parse(XContentParser parser) throws IOException { + public static DateRange parse(XContentParser parser) throws IOException { Instant startTime = null; Instant endTime = null; @@ -89,7 +89,7 @@ public static DetectionDateRange parse(XContentParser parser) throws IOException break; } } - return new DetectionDateRange(startTime, endTime); + return new DateRange(startTime, endTime); } @Generated @@ -99,7 +99,7 @@ public boolean equals(Object o) { return true; if (o == null || getClass() != o.getClass()) return false; - DetectionDateRange that = (DetectionDateRange) o; + DateRange that = (DateRange) o; return Objects.equal(getStartTime(), that.getStartTime()) && Objects.equal(getEndTime(), that.getEndTime()); } diff --git a/src/main/java/org/opensearch/ad/model/Entity.java b/src/main/java/org/opensearch/timeseries/model/Entity.java similarity index 79% rename from src/main/java/org/opensearch/ad/model/Entity.java rename to src/main/java/org/opensearch/timeseries/model/Entity.java index 0871c8c63..f05f5dc2a 100644 --- a/src/main/java/org/opensearch/ad/model/Entity.java +++ b/src/main/java/org/opensearch/timeseries/model/Entity.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; @@ -25,9 +25,6 @@ import java.util.TreeMap; import org.apache.lucene.util.SetOnce; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.common.Numbers; import org.opensearch.common.hash.MurmurHash3; import org.opensearch.common.xcontent.LoggingDeprecationHandler; @@ -42,6 +39,8 @@ import org.opensearch.core.xcontent.XContentParser; import org.opensearch.core.xcontent.XContentParser.Token; import org.opensearch.index.query.TermQueryBuilder; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.constant.CommonName; import com.google.common.base.Joiner; import com.google.common.base.Objects; @@ -229,53 +228,53 @@ private static String normalizedAttributes(SortedMap attributes) } /** - * Create model Id out of detector Id and attribute name and value pairs - * - * HCAD v1 uses the categorical value as part of the model document Id, but - * OpenSearch’s document Id can be at most 512 bytes. Categorical values are - * usually less than 256 characters, but can grow to 32766 in theory. - * HCAD v1 skips an entity if the entity's name is more than 256 characters. - * We cannot do that in v2 as that can reject a lot of entities. To overcome - * the obstacle, we hash categorical values to a 128-bit string (like SHA-1 - * that git uses) and use the hash as part of the model document Id. - * - * We have choices to make regarding when to use the hash as part of a model - * document Id: for all HC detectors or a HC detector with multiple categorical - * fields. The challenge lies in providing backward compatibility of looking for - * a model checkpoint in the case of a HC detector with one categorical field. - * If using hashes for all HC detectors, we need two get requests to ensure that - * a model checkpoint exists. One uses the document Id without a hash, while one - * uses the document Id with a hash. The dual get requests are ineffective. If - * limiting hashes to a HC detector with multiple categorical fields, there is - * no backward compatibility issue. However, the code will be branchy. One may - * wonder if backward compatibility can be ignored; indeed, the old checkpoints - * will be gone after a transition period during upgrading. During the transition - * period, HC detectors can experience unnecessary cold starts as if the - * detectors were just started. Checkpoint index size can double if every model - * has two model documents. The transition period can be as long as 3 days since - * our checkpoint retention period is 3 days. There is no perfect solution. We - * prefer limiting hashes to an HC detector with multiple categorical fields as - * its customer impact is none. - * - * @param detectorId Detector Id - * @param attributes Attributes of an entity - * @return the model Id - */ - public static Optional getModelId(String detectorId, SortedMap attributes) { + * Create model Id out of config Id and the attribute name and value pairs + * + * HCAD v1 uses the categorical value as part of the model document Id, + * but OpenSearch's document Id can be at most 512 bytes. Categorical + * values are usually less than 256 characters but can grow to 32766 + * in theory. HCAD v1 skips an entity if the entity's name is more than + * 256 characters. 
We cannot do that in v2 as that can reject a lot of + * entities. To overcome the obstacle, we hash categorical values to a + * 128-bit string (like SHA-1 that git uses) and use the hash as part + * of the model document Id. + * + * We have choices regarding when to use the hash as part of a model + * document Id: for all HC detectors or an HC detector with multiple + * categorical fields. The challenge lies in providing backward + * compatibility by looking for a model checkpoint for an HC detector + * with one categorical field. If using hashes for all HC detectors, + * we need two get requests to ensure that a model checkpoint exists. + * One uses the document Id without a hash, while one uses the document + * Id with a hash. The dual get requests are ineffective. If limiting + * hashes to an HC detector with multiple categorical fields, there is + * no backward compatibility issue. However, the code will be branchy. + * One may wonder if backward compatibility can be ignored; indeed, + * the old checkpoints will be gone after a transition period during + * upgrading. During the transition period, an HC detector can + * experience unnecessary cold starts as if it were just + * started. The checkpoint index size can double if every model has + * two model documents. The transition period can be three days since + * our checkpoint retention period is three days. + * + * There is no perfect solution. Considering that we can initialize one + * million models within 15 minutes in our performance test, we prefer + * to keep one and multiple categorical fields consistent and use hash + * only. This lifts the limitation that the categorical values cannot + * be more than 256 characters when there is one categorical field. + * Also, we will use hashes for new analyses like forecasting, regardless + * of the number of categorical fields. Using hashes always helps simplify + * our code base without worrying about whether or not the config is + * an AnomalyDetector. Thus, we prefer a hash-only solution + * for ease of use and maintainability. + * + * @param configId config Id + * @param attributes Attributes of an entity + * @return the model Id + */ + private static Optional getModelId(String configId, SortedMap attributes) { if (attributes.isEmpty()) { return Optional.empty(); - } else if (attributes.size() == 1) { - for (Map.Entry categoryValuePair : attributes.entrySet()) { - // For OpenSearch, the limit of the document ID is 512 bytes. - // skip an entity if the entity's name is more than 256 characters - // since we are using it as part of document id. - String categoricalValue = categoryValuePair.getValue().toString(); - if (categoricalValue.length() > AnomalyDetectorSettings.MAX_ENTITY_LENGTH) { - return Optional.empty(); - } - return Optional.of(detectorId + MODEL_ID_INFIX + categoricalValue); - } - return Optional.empty(); } else { String normalizedFields = normalizedAttributes(attributes); MurmurHash3.Hash128 hashFunc = MurmurHash3 @@ -291,20 +290,20 @@ public static Optional getModelId(String detectorId, SortedMap attributes) { - public Optional getModelId(String detectorId) { + public Optional getModelId(String configId) { if (modelId.get() == null) { // computing model id is not cheap and the result is deterministic. We only do it once.
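Restated standalone, the scheme is: normalize the attribute pairs deterministically, hash them to 128 bits, and append an encoding of the hash to the config id. A self-contained sketch; MD5 serves purely as a 128-bit stand-in for the MurmurHash3 hash the plugin actually uses, and the "_entity_" infix, base64 encoding, and simplified normalization are assumptions for illustration:

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.Base64;
    import java.util.SortedMap;

    final class ModelIdSketch {
        static String modelId(String configId, SortedMap<String, String> attributes) throws Exception {
            StringBuilder normalized = new StringBuilder();
            // SortedMap gives a deterministic order, so equal entities hash equally.
            attributes.forEach((k, v) -> normalized.append(k).append(v));
            byte[] hash = MessageDigest.getInstance("MD5") // 128 bits, like MurmurHash3's Hash128
                .digest(normalized.toString().getBytes(StandardCharsets.UTF_8));
            return configId + "_entity_" + Base64.getUrlEncoder().withoutPadding().encodeToString(hash);
        }
    }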
- Optional computedModelId = Entity.getModelId(detectorId, attributes); + Optional computedModelId = Entity.getModelId(configId, attributes); if (computedModelId.isPresent()) { this.modelId.set(computedModelId.get()); } else { diff --git a/src/main/java/org/opensearch/ad/model/Feature.java b/src/main/java/org/opensearch/timeseries/model/Feature.java similarity index 96% rename from src/main/java/org/opensearch/ad/model/Feature.java rename to src/main/java/org/opensearch/timeseries/model/Feature.java index 07d5b104f..045a6b96b 100644 --- a/src/main/java/org/opensearch/ad/model/Feature.java +++ b/src/main/java/org/opensearch/timeseries/model/Feature.java @@ -9,15 +9,13 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import java.io.IOException; import org.apache.logging.log4j.util.Strings; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.common.UUIDs; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; @@ -26,11 +24,13 @@ import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.search.aggregations.AggregationBuilder; +import org.opensearch.timeseries.annotation.Generated; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; /** - * Anomaly detector feature + * time series to analyze (a.k.a. feature) */ public class Feature implements Writeable, ToXContentObject { diff --git a/src/main/java/org/opensearch/ad/model/FeatureData.java b/src/main/java/org/opensearch/timeseries/model/FeatureData.java similarity index 97% rename from src/main/java/org/opensearch/ad/model/FeatureData.java rename to src/main/java/org/opensearch/timeseries/model/FeatureData.java index 55705d55d..584dfbf4f 100644 --- a/src/main/java/org/opensearch/ad/model/FeatureData.java +++ b/src/main/java/org/opensearch/timeseries/model/FeatureData.java @@ -9,17 +9,17 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import java.io.IOException; -import org.opensearch.ad.annotation.Generated; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.annotation.Generated; import com.google.common.base.Objects; diff --git a/src/main/java/org/opensearch/timeseries/model/IndexableResult.java b/src/main/java/org/opensearch/timeseries/model/IndexableResult.java new file mode 100644 index 000000000..7ccc58b59 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/model/IndexableResult.java @@ -0,0 +1,258 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.timeseries.model; + +import java.io.IOException; +import java.time.Instant; +import java.util.ArrayList; +import java.util.List; +import java.util.Optional; + +import org.apache.commons.lang.builder.ToStringBuilder; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.common.io.stream.Writeable; +import org.opensearch.core.xcontent.ToXContentObject; +import org.opensearch.timeseries.annotation.Generated; + +import com.google.common.base.Objects; + +public abstract class IndexableResult implements Writeable, ToXContentObject { + + protected final String configId; + protected final List featureData; + protected final Instant dataStartTime; + protected final Instant dataEndTime; + protected final Instant executionStartTime; + protected final Instant executionEndTime; + protected final String error; + protected final Optional optionalEntity; + protected User user; + protected final Integer schemaVersion; + /* + * model id for easy aggregations of entities. The front end needs to query + * for entities ordered by the descending/ascending order of feature values. + * After supporting multi-category fields, it is hard to write such queries + * since the entity information is stored in a nested object array. + * Also, the front end has all code/queries/helper functions in place to + * rely on a single key per entity combo. Adding model id to the forecast result + * helps make the transition to multi-categorical fields less painful. + */ + protected final String modelId; + protected final String entityId; + protected final String taskId; + + public IndexableResult( + String configId, + List featureData, + Instant dataStartTime, + Instant dataEndTime, + Instant executionStartTime, + Instant executionEndTime, + String error, + Optional entity, + User user, + Integer schemaVersion, + String modelId, + String taskId + ) { + this.configId = configId; + this.featureData = featureData; + this.dataStartTime = dataStartTime; + this.dataEndTime = dataEndTime; + this.executionStartTime = executionStartTime; + this.executionEndTime = executionEndTime; + this.error = error; + this.optionalEntity = entity; + this.user = user; + this.schemaVersion = schemaVersion; + this.modelId = modelId; + this.taskId = taskId; + this.entityId = getEntityId(entity, configId); + } + + public IndexableResult(StreamInput input) throws IOException { + this.configId = input.readString(); + int featureSize = input.readVInt(); + this.featureData = new ArrayList<>(featureSize); + for (int i = 0; i < featureSize; i++) { + featureData.add(new FeatureData(input)); + } + this.dataStartTime = input.readInstant(); + this.dataEndTime = input.readInstant(); + this.executionStartTime = input.readInstant(); + this.executionEndTime = input.readInstant(); + this.error = input.readOptionalString(); + if (input.readBoolean()) { + this.optionalEntity = Optional.of(new Entity(input)); + } else { + this.optionalEntity = Optional.empty(); + } + if (input.readBoolean()) { + this.user = new User(input); + } else { + user = null; + } + this.schemaVersion = input.readInt(); + this.modelId = input.readOptionalString(); + this.taskId = input.readOptionalString(); + this.entityId = input.readOptionalString(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeString(configId); + out.writeVInt(featureData.size()); + for (FeatureData feature : featureData) {
feature.writeTo(out); + } + out.writeInstant(dataStartTime); + out.writeInstant(dataEndTime); + out.writeInstant(executionStartTime); + out.writeInstant(executionEndTime); + out.writeOptionalString(error); + if (optionalEntity.isPresent()) { + out.writeBoolean(true); + optionalEntity.get().writeTo(out); + } else { + out.writeBoolean(false); + } + if (user != null) { + out.writeBoolean(true); // user exists + user.writeTo(out); + } else { + out.writeBoolean(false); // user does not exist + } + out.writeInt(schemaVersion); + out.writeOptionalString(modelId); + out.writeOptionalString(taskId); + out.writeOptionalString(entityId); + } + + public String getConfigId() { + return configId; + } + + public List getFeatureData() { + return featureData; + } + + public Instant getDataStartTime() { + return dataStartTime; + } + + public Instant getDataEndTime() { + return dataEndTime; + } + + public Instant getExecutionStartTime() { + return executionStartTime; + } + + public Instant getExecutionEndTime() { + return executionEndTime; + } + + public String getError() { + return error; + } + + public Optional getEntity() { + return optionalEntity; + } + + public String getModelId() { + return modelId; + } + + public String getTaskId() { + return taskId; + } + + public String getEntityId() { + return entityId; + } + + /** + * entityId equals the model id. It is hard to explain to users what + * modelId is; entityId is more user-friendly. + * @param entity Entity info + * @param configId config id + * @return entity id + */ + public static String getEntityId(Optional entity, String configId) { + return entity.flatMap(e -> e.getModelId(configId)).orElse(null); + } + + @Override + public boolean equals(Object o) { + if (this == o) + return true; + if (o == null || getClass() != o.getClass()) + return false; + IndexableResult that = (IndexableResult) o; + return Objects.equal(configId, that.configId) + && Objects.equal(taskId, that.taskId) + && Objects.equal(featureData, that.featureData) + && Objects.equal(dataStartTime, that.dataStartTime) + && Objects.equal(dataEndTime, that.dataEndTime) + && Objects.equal(executionStartTime, that.executionStartTime) + && Objects.equal(executionEndTime, that.executionEndTime) + && Objects.equal(error, that.error) + && Objects.equal(optionalEntity, that.optionalEntity) + && Objects.equal(modelId, that.modelId) + && Objects.equal(entityId, that.entityId); + } + + @Generated + @Override + public int hashCode() { + return Objects + .hashCode( + configId, + taskId, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + optionalEntity, + modelId, + entityId + ); + } + + @Override + public String toString() { + return new ToStringBuilder(this) + .append("configId", configId) + .append("taskId", taskId) + .append("featureData", featureData) + .append("dataStartTime", dataStartTime) + .append("dataEndTime", dataEndTime) + .append("executionStartTime", executionStartTime) + .append("executionEndTime", executionEndTime) + .append("error", error) + .append("entity", optionalEntity) + .append("modelId", modelId) + .append("entityId", entityId) + .toString(); + } + + /** + * Used to throw away requests when index pressure is high. + * @return whether the result is high priority.
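Because isHighPriority() is left abstract, each concrete result type decides what survives load shedding. A hypothetical minimal subclass, only to illustrate the contract (the class name and its priority rule are invented for illustration; the plugin's real subclasses are results such as AnomalyResult):

    import java.io.IOException;
    import org.opensearch.core.common.io.stream.StreamInput;
    import org.opensearch.core.xcontent.ToXContent;
    import org.opensearch.core.xcontent.XContentBuilder;

    class SketchResult extends IndexableResult {
        SketchResult(StreamInput in) throws IOException {
            super(in); // the base class deserializes the shared fields shown above
        }

        @Override
        public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException {
            return builder.startObject().field("config_id", getConfigId()).endObject();
        }

        @Override
        public boolean isHighPriority() {
            // assumption: keep real-time results (those without a task id)
            return getTaskId() == null;
        }
    }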
+ */ + public abstract boolean isHighPriority(); +} diff --git a/src/main/java/org/opensearch/ad/model/IntervalTimeConfiguration.java b/src/main/java/org/opensearch/timeseries/model/IntervalTimeConfiguration.java similarity index 86% rename from src/main/java/org/opensearch/ad/model/IntervalTimeConfiguration.java rename to src/main/java/org/opensearch/timeseries/model/IntervalTimeConfiguration.java index 92652ff5b..eaa6301df 100644 --- a/src/main/java/org/opensearch/ad/model/IntervalTimeConfiguration.java +++ b/src/main/java/org/opensearch/timeseries/model/IntervalTimeConfiguration.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import java.io.IOException; import java.time.Duration; @@ -17,11 +17,11 @@ import java.util.Locale; import java.util.Set; -import org.opensearch.ad.annotation.Generated; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.annotation.Generated; import com.google.common.base.Objects; import com.google.common.collect.ImmutableSet; @@ -42,11 +42,17 @@ public class IntervalTimeConfiguration extends TimeConfiguration { public IntervalTimeConfiguration(long interval, ChronoUnit unit) { if (interval < 0) { throw new IllegalArgumentException( - String.format(Locale.ROOT, "Interval %s %s", interval, CommonErrorMessages.NEGATIVE_TIME_CONFIGURATION) + String + .format( + Locale.ROOT, + "Interval %s %s", + interval, + org.opensearch.timeseries.constant.CommonMessages.NEGATIVE_TIME_CONFIGURATION + ) ); } if (!SUPPORTED_UNITS.contains(unit)) { - throw new IllegalArgumentException(String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIME_CONFIGURATION_UNITS, unit)); + throw new IllegalArgumentException(String.format(Locale.ROOT, ADCommonMessages.INVALID_TIME_CONFIGURATION_UNITS, unit)); } this.interval = interval; this.unit = unit; diff --git a/src/main/java/org/opensearch/ad/model/AnomalyDetectorJob.java b/src/main/java/org/opensearch/timeseries/model/Job.java similarity index 90% rename from src/main/java/org/opensearch/ad/model/AnomalyDetectorJob.java rename to src/main/java/org/opensearch/timeseries/model/Job.java index 360ceb049..958152e2c 100644 --- a/src/main/java/org/opensearch/ad/model/AnomalyDetectorJob.java +++ b/src/main/java/org/opensearch/timeseries/model/Job.java @@ -9,15 +9,14 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.DEFAULT_AD_JOB_LOC_DURATION_SECONDS; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.DEFAULT_JOB_LOC_DURATION_SECONDS; import java.io.IOException; import java.time.Instant; -import org.opensearch.ad.util.ParseUtils; import org.opensearch.commons.authuser.User; import org.opensearch.core.ParseField; import org.opensearch.core.common.io.stream.StreamInput; @@ -32,13 +31,14 @@ import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule; import org.opensearch.jobscheduler.spi.schedule.Schedule; import org.opensearch.jobscheduler.spi.schedule.ScheduleParser; +import org.opensearch.timeseries.util.ParseUtils; import com.google.common.base.Objects; /** * Anomaly detector job. 
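* Renamed from AnomalyDetectorJob: with this refactor the same scheduled-job parameter is meant to back both AD and forecasting jobs.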
*/ -public class AnomalyDetectorJob implements Writeable, ToXContentObject, ScheduledJobParameter { +public class Job implements Writeable, ToXContentObject, ScheduledJobParameter { enum ScheduleType { CRON, INTERVAL @@ -46,12 +46,11 @@ enum ScheduleType { public static final String PARSE_FIELD_NAME = "AnomalyDetectorJob"; public static final NamedXContentRegistry.Entry XCONTENT_REGISTRY = new NamedXContentRegistry.Entry( - AnomalyDetectorJob.class, + Job.class, new ParseField(PARSE_FIELD_NAME), it -> parse(it) ); - public static final String ANOMALY_DETECTOR_JOB_INDEX = ".opendistro-anomaly-detector-jobs"; public static final String NAME_FIELD = "name"; public static final String LAST_UPDATE_TIME_FIELD = "last_update_time"; public static final String LOCK_DURATION_SECONDS = "lock_duration_seconds"; @@ -75,7 +74,7 @@ enum ScheduleType { private final User user; private String resultIndex; - public AnomalyDetectorJob( + public Job( String name, Schedule schedule, TimeConfiguration windowDelay, @@ -99,9 +98,9 @@ public AnomalyDetectorJob( this.resultIndex = resultIndex; } - public AnomalyDetectorJob(StreamInput input) throws IOException { + public Job(StreamInput input) throws IOException { name = input.readString(); - if (input.readEnum(AnomalyDetectorJob.ScheduleType.class) == ScheduleType.CRON) { + if (input.readEnum(Job.ScheduleType.class) == ScheduleType.CRON) { schedule = new CronSchedule(input); } else { schedule = new IntervalSchedule(input); @@ -167,7 +166,7 @@ public void writeTo(StreamOutput output) throws IOException { output.writeOptionalString(resultIndex); } - public static AnomalyDetectorJob parse(XContentParser parser) throws IOException { + public static Job parse(XContentParser parser) throws IOException { String name = null; Schedule schedule = null; TimeConfiguration windowDelay = null; @@ -176,7 +175,7 @@ public static AnomalyDetectorJob parse(XContentParser parser) throws IOException Instant enabledTime = null; Instant disabledTime = null; Instant lastUpdateTime = null; - Long lockDurationSeconds = DEFAULT_AD_JOB_LOC_DURATION_SECONDS; + Long lockDurationSeconds = DEFAULT_JOB_LOC_DURATION_SECONDS; User user = null; String resultIndex = null; @@ -221,7 +220,7 @@ public static AnomalyDetectorJob parse(XContentParser parser) throws IOException break; } } - return new AnomalyDetectorJob( + return new Job( name, schedule, windowDelay, @@ -241,7 +240,7 @@ public boolean equals(Object o) { return true; if (o == null || getClass() != o.getClass()) return false; - AnomalyDetectorJob that = (AnomalyDetectorJob) o; + Job that = (Job) o; return Objects.equal(getName(), that.getName()) && Objects.equal(getSchedule(), that.getSchedule()) && Objects.equal(isEnabled(), that.isEnabled()) @@ -249,7 +248,7 @@ public boolean equals(Object o) { && Objects.equal(getDisabledTime(), that.getDisabledTime()) && Objects.equal(getLastUpdateTime(), that.getLastUpdateTime()) && Objects.equal(getLockDurationSeconds(), that.getLockDurationSeconds()) - && Objects.equal(getResultIndex(), that.getResultIndex()); + && Objects.equal(getCustomResultIndex(), that.getCustomResultIndex()); } @Override @@ -299,7 +298,7 @@ public User getUser() { return user; } - public String getResultIndex() { + public String getCustomResultIndex() { return resultIndex; } } diff --git a/src/main/java/org/opensearch/ad/model/MergeableList.java b/src/main/java/org/opensearch/timeseries/model/MergeableList.java similarity index 91% rename from src/main/java/org/opensearch/ad/model/MergeableList.java rename to 
src/main/java/org/opensearch/timeseries/model/MergeableList.java index 4bb0d7842..fd9f26e84 100644 --- a/src/main/java/org/opensearch/ad/model/MergeableList.java +++ b/src/main/java/org/opensearch/timeseries/model/MergeableList.java @@ -9,10 +9,12 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import java.util.List; +import org.opensearch.ad.model.Mergeable; + public class MergeableList<T> implements Mergeable { private final List<T> elements; diff --git a/src/main/java/org/opensearch/ad/model/ADTaskState.java b/src/main/java/org/opensearch/timeseries/model/TaskState.java similarity index 68% rename from src/main/java/org/opensearch/ad/model/ADTaskState.java rename to src/main/java/org/opensearch/timeseries/model/TaskState.java index 68462f816..2b5c4240e 100644 --- a/src/main/java/org/opensearch/ad/model/ADTaskState.java +++ b/src/main/java/org/opensearch/timeseries/model/TaskState.java @@ -9,47 +9,47 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import java.util.List; import com.google.common.collect.ImmutableList; /** - * AD task states. + * AD and forecasting task states. *
    *
  • CREATED: - * When user start a historical detector, we will create one task to track the detector + * AD: When a user starts a historical detector, we create one task to track the detector * execution and set its state as CREATED * *
  • INIT: - * After task created, coordinate node will gather all eligible node’s state and dispatch + * AD: After the task is created, the coordinating node gathers all eligible nodes' states and dispatches the * task to the worker node with the lowest load. When the worker node receives the request, * it sets the task state as INIT immediately, then starts cold start to train the * RCF model. We track the initialization progress in the task: * Init_Progress=ModelUpdates/MinSampleSize * *
  • RUNNING: - * If RCF model gets enough data points and passed training, it will start to detect data + * AD: Once the RCF model gets enough data points and passes training, it starts to detect data * normally and output positive anomaly scores. Once the RCF model starts to output positive * anomaly scores, we set the task state as RUNNING and the init progress as 100%. We * track task running progress in the task: Task_Progress=DetectedPieces/AllPieces * *
  • FINISHED: - * When all historical data detected, we set the task state as FINISHED and task progress + * AD: When all historical data has been detected, we set the task state as FINISHED and the task progress * as 100%. * *
  • STOPPED: - * User can cancel a running task by stopping detector, for example, user want to tune + * AD: A user can cancel a running task by stopping the detector, for example, when the user wants to tune * features and rerun and doesn't want the current task to run any more. When a historical detector * is stopped, we mark the task flag cancelled as true; when running the next piece, we * check this flag and stop the task. Once the task has stopped, we set its state as STOPPED * *
  • FAILED: - * If any exception happen, we will set task state as FAILED + * AD: If any exception happens, we set the task state as FAILED *
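 * <p>Typical lifecycle: CREATED -> INIT -> RUNNING -> FINISHED, with STOPPED and
 * FAILED as terminal states reachable from any not-ended state (see NOT_ENDED_STATES below).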
*/ -public enum ADTaskState { +public enum TaskState { CREATED, INIT, RUNNING, @@ -58,5 +58,5 @@ public enum ADTaskState { FINISHED; public static List NOT_ENDED_STATES = ImmutableList - .of(ADTaskState.CREATED.name(), ADTaskState.INIT.name(), ADTaskState.RUNNING.name()); + .of(TaskState.CREATED.name(), TaskState.INIT.name(), TaskState.RUNNING.name()); } diff --git a/src/main/java/org/opensearch/timeseries/model/TaskType.java b/src/main/java/org/opensearch/timeseries/model/TaskType.java new file mode 100644 index 000000000..74481871d --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/model/TaskType.java @@ -0,0 +1,17 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.model; + +import java.util.List; +import java.util.stream.Collectors; + +public interface TaskType { + String name(); + + public static List taskTypeToString(List adTaskTypes) { + return adTaskTypes.stream().map(type -> type.name()).collect(Collectors.toList()); + } +} diff --git a/src/main/java/org/opensearch/ad/model/TimeConfiguration.java b/src/main/java/org/opensearch/timeseries/model/TimeConfiguration.java similarity index 98% rename from src/main/java/org/opensearch/ad/model/TimeConfiguration.java rename to src/main/java/org/opensearch/timeseries/model/TimeConfiguration.java index 291c96eb5..d370e524e 100644 --- a/src/main/java/org/opensearch/ad/model/TimeConfiguration.java +++ b/src/main/java/org/opensearch/timeseries/model/TimeConfiguration.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; diff --git a/src/main/java/org/opensearch/timeseries/model/TimeSeriesTask.java b/src/main/java/org/opensearch/timeseries/model/TimeSeriesTask.java new file mode 100644 index 000000000..fd57de7cd --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/model/TimeSeriesTask.java @@ -0,0 +1,448 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.model; + +import static org.opensearch.timeseries.model.TaskState.NOT_ENDED_STATES; + +import java.io.IOException; +import java.time.Instant; + +import org.opensearch.commons.authuser.User; +import org.opensearch.core.common.io.stream.Writeable; +import org.opensearch.core.xcontent.ToXContentObject; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.annotation.Generated; + +import com.google.common.base.Objects; + +public abstract class TimeSeriesTask implements ToXContentObject, Writeable { + + public static final String TASK_ID_FIELD = "task_id"; + public static final String LAST_UPDATE_TIME_FIELD = "last_update_time"; + public static final String STARTED_BY_FIELD = "started_by"; + public static final String STOPPED_BY_FIELD = "stopped_by"; + public static final String ERROR_FIELD = "error"; + public static final String STATE_FIELD = "state"; + public static final String TASK_PROGRESS_FIELD = "task_progress"; + public static final String INIT_PROGRESS_FIELD = "init_progress"; + public static final String CURRENT_PIECE_FIELD = "current_piece"; + public static final String EXECUTION_START_TIME_FIELD = "execution_start_time"; + public static final String EXECUTION_END_TIME_FIELD = "execution_end_time"; + public static final String IS_LATEST_FIELD = "is_latest"; + public static final String TASK_TYPE_FIELD = "task_type"; + public static 
final String CHECKPOINT_ID_FIELD = "checkpoint_id"; + public static final String COORDINATING_NODE_FIELD = "coordinating_node"; + public static final String WORKER_NODE_FIELD = "worker_node"; + public static final String ENTITY_FIELD = "entity"; + public static final String PARENT_TASK_ID_FIELD = "parent_task_id"; + public static final String ESTIMATED_MINUTES_LEFT_FIELD = "estimated_minutes_left"; + public static final String USER_FIELD = "user"; + public static final String HISTORICAL_TASK_PREFIX = "HISTORICAL"; + + protected String configId = null; + protected String taskId = null; + protected Instant lastUpdateTime = null; + protected String startedBy = null; + protected String stoppedBy = null; + protected String error = null; + protected String state = null; + protected Float taskProgress = null; + protected Float initProgress = null; + protected Instant currentPiece = null; + protected Instant executionStartTime = null; + protected Instant executionEndTime = null; + protected Boolean isLatest = null; + protected String taskType = null; + protected String checkpointId = null; + protected String coordinatingNode = null; + protected String workerNode = null; + protected Entity entity = null; + protected String parentTaskId = null; + protected Integer estimatedMinutesLeft = null; + protected User user = null; + + @SuppressWarnings("unchecked") + public abstract static class Builder> { + protected String configId = null; + protected String taskId = null; + protected String taskType = null; + protected String state = null; + protected Float taskProgress = null; + protected Float initProgress = null; + protected Instant currentPiece = null; + protected Instant executionStartTime = null; + protected Instant executionEndTime = null; + protected Boolean isLatest = null; + protected String error = null; + protected String checkpointId = null; + protected Instant lastUpdateTime = null; + protected String startedBy = null; + protected String stoppedBy = null; + protected String coordinatingNode = null; + protected String workerNode = null; + protected Entity entity = null; + protected String parentTaskId; + protected Integer estimatedMinutesLeft; + protected User user = null; + + public Builder() {} + + public T configId(String configId) { + this.configId = configId; + return (T) this; + } + + public T taskId(String taskId) { + this.taskId = taskId; + return (T) this; + } + + public T lastUpdateTime(Instant lastUpdateTime) { + this.lastUpdateTime = lastUpdateTime; + return (T) this; + } + + public T startedBy(String startedBy) { + this.startedBy = startedBy; + return (T) this; + } + + public T stoppedBy(String stoppedBy) { + this.stoppedBy = stoppedBy; + return (T) this; + } + + public T error(String error) { + this.error = error; + return (T) this; + } + + public T state(String state) { + this.state = state; + return (T) this; + } + + public T taskProgress(Float taskProgress) { + this.taskProgress = taskProgress; + return (T) this; + } + + public T initProgress(Float initProgress) { + this.initProgress = initProgress; + return (T) this; + } + + public T currentPiece(Instant currentPiece) { + this.currentPiece = currentPiece; + return (T) this; + } + + public T executionStartTime(Instant executionStartTime) { + this.executionStartTime = executionStartTime; + return (T) this; + } + + public T executionEndTime(Instant executionEndTime) { + this.executionEndTime = executionEndTime; + return (T) this; + } + + public T isLatest(Boolean isLatest) { + this.isLatest = isLatest; + return (T) this; + } + 
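+ // Illustrative use of the self-typed builder from a concrete subclass (the ADTask
+ // subclass and its build() method are assumed here, not defined in this file):
+ //   ADTask task = new ADTask.Builder()
+ //       .configId("config-1")
+ //       .taskType("HISTORICAL_SINGLE_ENTITY")
+ //       .state(TaskState.CREATED.name())
+ //       .isLatest(true)
+ //       .build();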
+ public T taskType(String taskType) { + this.taskType = taskType; + return (T) this; + } + + public T checkpointId(String checkpointId) { + this.checkpointId = checkpointId; + return (T) this; + } + + public T coordinatingNode(String coordinatingNode) { + this.coordinatingNode = coordinatingNode; + return (T) this; + } + + public T workerNode(String workerNode) { + this.workerNode = workerNode; + return (T) this; + } + + public T entity(Entity entity) { + this.entity = entity; + return (T) this; + } + + public T parentTaskId(String parentTaskId) { + this.parentTaskId = parentTaskId; + return (T) this; + } + + public T estimatedMinutesLeft(Integer estimatedMinutesLeft) { + this.estimatedMinutesLeft = estimatedMinutesLeft; + return (T) this; + } + + public T user(User user) { + this.user = user; + return (T) this; + } + } + + public boolean isHistoricalTask() { + return taskType.startsWith(TimeSeriesTask.HISTORICAL_TASK_PREFIX); + } + + /** + * Get config level task id. If a task has no parent task, the task is config level task. + * @return config level task id + */ + public String getConfigLevelTaskId() { + return getParentTaskId() != null ? getParentTaskId() : getTaskId(); + } + + public String getTaskId() { + return taskId; + } + + public void setTaskId(String taskId) { + this.taskId = taskId; + } + + public Instant getLastUpdateTime() { + return lastUpdateTime; + } + + public String getStartedBy() { + return startedBy; + } + + public String getStoppedBy() { + return stoppedBy; + } + + public String getError() { + return error; + } + + public void setError(String error) { + this.error = error; + } + + public String getState() { + return state; + } + + public void setState(String state) { + this.state = state; + } + + public Float getTaskProgress() { + return taskProgress; + } + + public Float getInitProgress() { + return initProgress; + } + + public Instant getCurrentPiece() { + return currentPiece; + } + + public Instant getExecutionStartTime() { + return executionStartTime; + } + + public Instant getExecutionEndTime() { + return executionEndTime; + } + + public Boolean isLatest() { + return isLatest; + } + + public String getTaskType() { + return taskType; + } + + public String getCheckpointId() { + return checkpointId; + } + + public String getCoordinatingNode() { + return coordinatingNode; + } + + public String getWorkerNode() { + return workerNode; + } + + public Entity getEntity() { + return entity; + } + + public String getParentTaskId() { + return parentTaskId; + } + + public Integer getEstimatedMinutesLeft() { + return estimatedMinutesLeft; + } + + public User getUser() { + return user; + } + + public String getConfigId() { + return configId; + } + + public void setLatest(Boolean latest) { + isLatest = latest; + } + + public void setLastUpdateTime(Instant lastUpdateTime) { + this.lastUpdateTime = lastUpdateTime; + } + + public boolean isDone() { + return !NOT_ENDED_STATES.contains(this.getState()); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + if (taskId != null) { + builder.field(TimeSeriesTask.TASK_ID_FIELD, taskId); + } + if (lastUpdateTime != null) { + builder.field(TimeSeriesTask.LAST_UPDATE_TIME_FIELD, lastUpdateTime.toEpochMilli()); + } + if (startedBy != null) { + builder.field(TimeSeriesTask.STARTED_BY_FIELD, startedBy); + } + if (stoppedBy != null) { + builder.field(TimeSeriesTask.STOPPED_BY_FIELD, stoppedBy); + } + if (error != null) { + builder.field(TimeSeriesTask.ERROR_FIELD, error); + } + if 
(state != null) { + builder.field(TimeSeriesTask.STATE_FIELD, state); + } + if (taskProgress != null) { + builder.field(TimeSeriesTask.TASK_PROGRESS_FIELD, taskProgress); + } + if (initProgress != null) { + builder.field(TimeSeriesTask.INIT_PROGRESS_FIELD, initProgress); + } + if (currentPiece != null) { + builder.field(TimeSeriesTask.CURRENT_PIECE_FIELD, currentPiece.toEpochMilli()); + } + if (executionStartTime != null) { + builder.field(TimeSeriesTask.EXECUTION_START_TIME_FIELD, executionStartTime.toEpochMilli()); + } + if (executionEndTime != null) { + builder.field(TimeSeriesTask.EXECUTION_END_TIME_FIELD, executionEndTime.toEpochMilli()); + } + if (isLatest != null) { + builder.field(TimeSeriesTask.IS_LATEST_FIELD, isLatest); + } + if (taskType != null) { + builder.field(TimeSeriesTask.TASK_TYPE_FIELD, taskType); + } + if (checkpointId != null) { + builder.field(TimeSeriesTask.CHECKPOINT_ID_FIELD, checkpointId); + } + if (coordinatingNode != null) { + builder.field(TimeSeriesTask.COORDINATING_NODE_FIELD, coordinatingNode); + } + if (workerNode != null) { + builder.field(TimeSeriesTask.WORKER_NODE_FIELD, workerNode); + } + if (entity != null) { + builder.field(TimeSeriesTask.ENTITY_FIELD, entity); + } + if (parentTaskId != null) { + builder.field(TimeSeriesTask.PARENT_TASK_ID_FIELD, parentTaskId); + } + if (estimatedMinutesLeft != null) { + builder.field(TimeSeriesTask.ESTIMATED_MINUTES_LEFT_FIELD, estimatedMinutesLeft); + } + if (user != null) { + builder.field(TimeSeriesTask.USER_FIELD, user); + } + return builder; + } + + @Generated + @Override + public boolean equals(Object o) { + if (this == o) + return true; + if (o == null || getClass() != o.getClass()) + return false; + TimeSeriesTask that = (TimeSeriesTask) o; + return Objects.equal(getConfigId(), that.getConfigId()) + && Objects.equal(getTaskId(), that.getTaskId()) + && Objects.equal(getLastUpdateTime(), that.getLastUpdateTime()) + && Objects.equal(getStartedBy(), that.getStartedBy()) + && Objects.equal(getStoppedBy(), that.getStoppedBy()) + && Objects.equal(getError(), that.getError()) + && Objects.equal(getState(), that.getState()) + && Objects.equal(getTaskProgress(), that.getTaskProgress()) + && Objects.equal(getInitProgress(), that.getInitProgress()) + && Objects.equal(getCurrentPiece(), that.getCurrentPiece()) + && Objects.equal(getExecutionStartTime(), that.getExecutionStartTime()) + && Objects.equal(getExecutionEndTime(), that.getExecutionEndTime()) + && Objects.equal(isLatest(), that.isLatest()) + && Objects.equal(getTaskType(), that.getTaskType()) + && Objects.equal(getCheckpointId(), that.getCheckpointId()) + && Objects.equal(getCoordinatingNode(), that.getCoordinatingNode()) + && Objects.equal(getWorkerNode(), that.getWorkerNode()) + && Objects.equal(getEntity(), that.getEntity()) + && Objects.equal(getParentTaskId(), that.getParentTaskId()) + && Objects.equal(getEstimatedMinutesLeft(), that.getEstimatedMinutesLeft()) + && Objects.equal(getUser(), that.getUser()); + } + + @Generated + @Override + public int hashCode() { + return Objects + .hashCode( + taskId, + lastUpdateTime, + startedBy, + stoppedBy, + error, + state, + taskProgress, + initProgress, + currentPiece, + executionStartTime, + executionEndTime, + isLatest, + taskType, + checkpointId, + coordinatingNode, + workerNode, + entity, + parentTaskId, + estimatedMinutesLeft, + user + ); + } + + public abstract boolean isEntityTask(); + + public String getEntityModelId() { + return entity == null ? 
null : entity.getModelId(configId).orElse(null); + } +} diff --git a/src/main/java/org/opensearch/ad/model/ValidationAspect.java b/src/main/java/org/opensearch/timeseries/model/ValidationAspect.java similarity index 75% rename from src/main/java/org/opensearch/ad/model/ValidationAspect.java rename to src/main/java/org/opensearch/timeseries/model/ValidationAspect.java index a1583c875..95fbf2217 100644 --- a/src/main/java/org/opensearch/ad/model/ValidationAspect.java +++ b/src/main/java/org/opensearch/timeseries/model/ValidationAspect.java @@ -9,13 +9,15 @@ * GitHub history for details. */ -package org.opensearch.ad.model; +package org.opensearch.timeseries.model; import java.util.Collection; import java.util.Set; -import org.opensearch.ad.Name; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.timeseries.Name; +import org.opensearch.timeseries.constant.CommonName; /** * Validation Aspect enum. There two types of validation types for validation API, @@ -28,8 +30,9 @@ * */ public enum ValidationAspect implements Name { - DETECTOR(CommonName.DETECTOR_ASPECT), - MODEL(CommonName.MODEL_ASPECT); + DETECTOR(ADCommonName.DETECTOR_ASPECT), + MODEL(CommonName.MODEL_ASPECT), + FORECASTER(ForecastCommonName.FORECASTER_ASPECT); private String name; @@ -49,10 +52,12 @@ public String getName() { public static ValidationAspect getName(String name) { switch (name) { - case CommonName.DETECTOR_ASPECT: + case ADCommonName.DETECTOR_ASPECT: return DETECTOR; case CommonName.MODEL_ASPECT: return MODEL; + case ForecastCommonName.FORECASTER_ASPECT: + return FORECASTER; default: throw new IllegalArgumentException("Unsupported validation aspects"); } diff --git a/src/main/java/org/opensearch/timeseries/model/ValidationIssueType.java b/src/main/java/org/opensearch/timeseries/model/ValidationIssueType.java new file mode 100644 index 000000000..01913a9c6 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/model/ValidationIssueType.java @@ -0,0 +1,52 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.timeseries.model; + +import org.opensearch.ad.model.AnomalyDetector; +import org.opensearch.forecast.model.Forecaster; +import org.opensearch.timeseries.Name; + +public enum ValidationIssueType implements Name { + NAME(Config.NAME_FIELD), + TIMEFIELD_FIELD(Config.TIMEFIELD_FIELD), + SHINGLE_SIZE_FIELD(Config.SHINGLE_SIZE_FIELD), + INDICES(Config.INDICES_FIELD), + FEATURE_ATTRIBUTES(Config.FEATURE_ATTRIBUTES_FIELD), + CATEGORY(Config.CATEGORY_FIELD), + FILTER_QUERY(Config.FILTER_QUERY_FIELD), + WINDOW_DELAY(Config.WINDOW_DELAY_FIELD), + GENERAL_SETTINGS(Config.GENERAL_SETTINGS), + RESULT_INDEX(Config.RESULT_INDEX_FIELD), + TIMEOUT(Config.TIMEOUT), + AGGREGATION(Config.AGGREGATION), // this is a unique case where aggregation failed due to an issue in core but + // don't want to throw exception + IMPUTATION(Config.IMPUTATION_OPTION_FIELD), + DETECTION_INTERVAL(AnomalyDetector.DETECTION_INTERVAL_FIELD), + FORECAST_INTERVAL(Forecaster.FORECAST_INTERVAL_FIELD), + HORIZON_SIZE(Forecaster.HORIZON_FIELD); + + private String name; + + ValidationIssueType(String name) { + this.name = name; + } + + /** + * Get validation type + * + * @return name + */ + @Override + public String getName() { + return name; + } +} diff --git a/src/main/java/org/opensearch/ad/settings/AbstractSetting.java b/src/main/java/org/opensearch/timeseries/settings/DynamicNumericSetting.java similarity index 84% rename from src/main/java/org/opensearch/ad/settings/AbstractSetting.java rename to src/main/java/org/opensearch/timeseries/settings/DynamicNumericSetting.java index f3cf7a9b5..c9fe72a83 100644 --- a/src/main/java/org/opensearch/ad/settings/AbstractSetting.java +++ b/src/main/java/org/opensearch/timeseries/settings/DynamicNumericSetting.java @@ -1,15 +1,9 @@ /* + * Copyright OpenSearch Contributors * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. */ -package org.opensearch.ad.settings; +package org.opensearch.timeseries.settings; import java.util.ArrayList; import java.util.List; @@ -30,8 +24,8 @@ * as the enclosing instances are not singleton (i.e. deleted after use). * */ -public abstract class AbstractSetting { - private static Logger logger = LogManager.getLogger(AbstractSetting.class); +public abstract class DynamicNumericSetting { + private static Logger logger = LogManager.getLogger(DynamicNumericSetting.class); private ClusterService clusterService; /** Latest setting value for each registered key. Thread-safe is required. 
*/ @@ -39,7 +33,7 @@ public abstract class AbstractSetting { private final Map<String, Setting<?>> settings; - protected AbstractSetting(Map<String, Setting<?>> settings) { + protected DynamicNumericSetting(Map<String, Setting<?>> settings) { this.settings = settings; } diff --git a/src/main/java/org/opensearch/timeseries/settings/TimeSeriesSettings.java b/src/main/java/org/opensearch/timeseries/settings/TimeSeriesSettings.java new file mode 100644 index 000000000..56bbe187a --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/settings/TimeSeriesSettings.java @@ -0,0 +1,212 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.settings; + +import java.time.Duration; + +import org.opensearch.common.settings.Setting; +import org.opensearch.common.unit.TimeValue; + +public class TimeSeriesSettings { + + // ====================================== + // Model parameters + // ====================================== + public static final int DEFAULT_SHINGLE_SIZE = 8; + + // max shingle size we have seen from external users; + // the larger the shingle size, the harder it is to fill in a complete shingle + public static final int MAX_SHINGLE_SIZE = 60; + + public static final String CONFIG_INDEX_MAPPING_FILE = "mappings/config.json"; + + public static final String JOBS_INDEX_MAPPING_FILE = "mappings/job.json"; + + // 100,000 insertions cost roughly 1KB. + public static final int DOOR_KEEPER_FOR_COLD_STARTER_MAX_INSERTION = 100_000; + + public static final double DOOR_KEEPER_FALSE_POSITIVE_RATE = 0.01; + + // clean up door keeper every 60 intervals + public static final int DOOR_KEEPER_MAINTENANCE_FREQ = 60; + + // 1 million insertions cost roughly 1 MB. + public static final int DOOR_KEEPER_FOR_CACHE_MAX_INSERTION = 1_000_000; + + // for a real-time operation, we trade off speed for memory, as a real-time operation + // only has to do one update/scoring per interval + public static final double REAL_TIME_BOUNDING_BOX_CACHE_RATIO = 0; + + // ====================================== + // Historical analysis + // ====================================== + public static final int MAX_BATCH_TASK_PIECE_SIZE = 10_000; + + // within an interval, the fraction used to process requests. + // 1.0 means we use all of the detection interval to process requests. + // to ensure we don't block the next interval, it is better to set it to less than 1.0. + public static final float INTERVAL_RATIO_FOR_REQUESTS = 0.9f; + + public static final Duration HOURLY_MAINTENANCE = Duration.ofHours(1); + + // Maximum number of deleted tasks that can be kept in cache. + public static final Setting<Integer> MAX_CACHED_DELETED_TASKS = Setting + .intSetting("plugins.timeseries.max_cached_deleted_tasks", 1000, 1, 10_000, Setting.Property.NodeScope, Setting.Property.Dynamic); + + // ====================================== + // Checkpoint setting + // ====================================== + // we won't accept a checkpoint larger than 30MB, or we risk OOM. + // For reference, in RCF 1.0, the checkpoint of an RCF with 50 trees, 10 dimensions, + // 256 samples is about 3.2MB. + // In compact rcf, the same RCF is about 163KB. + // Since we allow at most 5 features, and the default shingle size is 8 and the default + // tree number is 100, we can have at most 25.6 MB in RCF 1.0. + // It is possible that customers increase the max features or shingle size, but we don't want + // to risk OOM for that flexibility.
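+ // Worked example behind the 25.6 MB bound above, assuming checkpoint size scales
+ // linearly with tree count and total dimensions: 3.2 MB * (100 trees / 50 trees)
+ // * ((5 features * 8 shingle size) / 10 dimensions) = 3.2 MB * 2 * 4 = 25.6 MB.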
+ public static final int MAX_CHECKPOINT_BYTES = 30_000_000; + + // Sets the cap on the number of buffers that can be allocated by the rcf deserialization + // buffer pool. Each buffer is 512 bytes. Memory occupied by 20 buffers is 10.24 KB. + public static final int MAX_TOTAL_RCF_SERIALIZATION_BUFFERS = 20; + + // the size of the buffer used for rcf deserialization + public static final int SERIALIZATION_BUFFER_BYTES = 512; + + // ====================================== + // rate-limiting queue parameters + // ====================================== + /** + * CheckpointWriteRequest consists of an IndexRequest (200 KB) and QueuedRequest + * fields (148 bytes, read comments of ENTITY_REQUEST_SIZE_CONSTANT). + * The total is roughly 200 KB per request. + * + * We don't want the total size to exceed 1% of the heap. + * We should have at most 1% heap / 200KB = heap / 20,000,000. + * For t3.small, 1% of the heap is 10MB. The queue's size is up to + * 10^7 / (2.0 * 10^5) = 50 + */ + public static int CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES = 200_000; + + /** + * ResultWriteRequest consists of an index request (roughly 1KB) and QueuedRequest + * fields (148 bytes, read comments of ENTITY_REQUEST_SIZE_CONSTANT). + * Plus Java object size (12 bytes), we have roughly 1160 bytes per request. + * + * We don't want the total size to exceed 1% of the heap. + * We should have at most 1% heap / 1160 = heap / 116,000. + * For t3.small, 1% of the heap is 10MB. The queue's size is up to + * 10^7 / 1160 = 8621 + */ + public static int RESULT_WRITE_QUEUE_SIZE_IN_BYTES = 1160; + + /** + * FeatureRequest has entityName (# category fields * 256, the recommended limit + * of a keyword field length), model Id (roughly 256 bytes), and QueuedRequest + * fields including config Id (roughly 128 bytes), dataStartTimeMillis (long, + * 8 bytes), and currentFeature (16 bytes, assume two features on average). + * Plus Java object size (12 bytes), we have roughly 932 bytes per request, + * assuming we have 2 categorical fields (we plan to support 2 categorical fields now). + * We don't want the total size to exceed 0.1% of the heap. + * We can have at most 0.1% heap / 932 = heap / 932,000. + * For t3.small, 0.1% of the heap is 1MB. The queue's size is up to + * 10^6 / 932 = 1072 + */ + public static int FEATURE_REQUEST_SIZE_IN_BYTES = 932; + + /** + * CheckpointMaintainRequest has model Id (roughly 256 bytes) and QueuedRequest + * fields including detector Id (roughly 128 bytes), expirationEpochMs (long, + * 8 bytes), and priority (12 bytes). + * Plus Java object size (12 bytes), we have roughly 416 bytes per request. + * We don't want the total size to exceed 0.1% of the heap. + * We can have at most 0.1% heap / 416 = heap / 416,000. + * For t3.small, 0.1% of the heap is 1MB.
The queue's size is up to + * 10^ 6 / 416 = 2403 + */ + public static int CHECKPOINT_MAINTAIN_REQUEST_SIZE_IN_BYTES = 416; + + public static final float MAX_QUEUED_TASKS_RATIO = 0.5f; + + public static final float MEDIUM_SEGMENT_PRUNE_RATIO = 0.1f; + + public static final float LOW_SEGMENT_PRUNE_RATIO = 0.3f; + + // expensive maintenance (e.g., queue maintenance) with 1/10000 probability + public static final int MAINTENANCE_FREQ_CONSTANT = 10000; + + public static final Duration QUEUE_MAINTENANCE = Duration.ofMinutes(10); + + // ====================================== + // ML parameters + // ====================================== + // RCF + public static final int NUM_SAMPLES_PER_TREE = 256; + + public static final int NUM_TREES = 30; + + public static final double TIME_DECAY = 0.0001; + + // If we have 32 + shingleSize (hopefully recent) values, RCF can get up and running. It will be noisy — + // there is a reason that default size is 256 (+ shingle size), but it may be more useful for people to + /// start seeing some results. + public static final int NUM_MIN_SAMPLES = 32; + + // for a batch operation, we want all of the bounding box in-place for speed + public static final double BATCH_BOUNDING_BOX_CACHE_RATIO = 1; + + // ====================================== + // Cold start setting + // ====================================== + public static int MAX_COLD_START_ROUNDS = 2; + + // Thresholding + public static final double THRESHOLD_MIN_PVALUE = 0.995; + + // ====================================== + // Cold start setting + // ====================================== + public static final Setting MAX_RETRY_FOR_UNRESPONSIVE_NODE = Setting + .intSetting("plugins.timeseries.max_retry_for_unresponsive_node", 5, 0, Setting.Property.NodeScope, Setting.Property.Dynamic); + + public static final Setting BACKOFF_MINUTES = Setting + .positiveTimeSetting( + "plugins.timeseries.backoff_minutes", + TimeValue.timeValueMinutes(15), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + public static final Setting COOLDOWN_MINUTES = Setting + .positiveTimeSetting( + "plugins.timeseries.cooldown_minutes", + TimeValue.timeValueMinutes(5), + Setting.Property.NodeScope, + Setting.Property.Dynamic + ); + + // ====================================== + // Index setting + // ====================================== + public static int MAX_UPDATE_RETRY_TIMES = 10_000; + + // ====================================== + // JOB + // ====================================== + public static final long DEFAULT_JOB_LOC_DURATION_SECONDS = 60; + + // ====================================== + // stats/profile API setting + // ====================================== + // profile API needs to report total entities. We can use cardinality aggregation for a single-category field. + // But we cannot do that for multi-category fields as it requires scripting to generate run time fields, + // which is expensive. We work around the problem by using a composite query to find the first 10_000 buckets. + // Generally, traversing all buckets/combinations can't be done without visiting all matches, which is costly + // for data with many entities. Given that it is often enough to have a lower bound of the number of entities, + // such as "there are at least 10000 entities", the default is set to 10,000. That is, requests will count the + // total entities up to 10,000. 
+ public static final int MAX_TOTAL_ENTITIES_TO_TRACK = 10_000; +} diff --git a/src/main/java/org/opensearch/ad/stats/StatNames.java b/src/main/java/org/opensearch/timeseries/stats/StatNames.java similarity index 98% rename from src/main/java/org/opensearch/ad/stats/StatNames.java rename to src/main/java/org/opensearch/timeseries/stats/StatNames.java index 6b595dd54..a72e3f1b0 100644 --- a/src/main/java/org/opensearch/ad/stats/StatNames.java +++ b/src/main/java/org/opensearch/timeseries/stats/StatNames.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.stats; +package org.opensearch.timeseries.stats; import java.util.HashSet; import java.util.Set; diff --git a/src/main/java/org/opensearch/ad/task/ADRealtimeTaskCache.java b/src/main/java/org/opensearch/timeseries/task/RealtimeTaskCache.java similarity index 89% rename from src/main/java/org/opensearch/ad/task/ADRealtimeTaskCache.java rename to src/main/java/org/opensearch/timeseries/task/RealtimeTaskCache.java index bf8cbb860..5fe0c3850 100644 --- a/src/main/java/org/opensearch/ad/task/ADRealtimeTaskCache.java +++ b/src/main/java/org/opensearch/timeseries/task/RealtimeTaskCache.java @@ -9,19 +9,19 @@ * GitHub history for details. */ -package org.opensearch.ad.task; +package org.opensearch.timeseries.task; import java.time.Instant; /** - * AD realtime task cache which will hold these data + * realtime task cache which will hold these data * 1. task state * 2. init progress * 3. error * 4. last job run time - * 5. detector interval + * 5. analysis interval */ -public class ADRealtimeTaskCache { +public class RealtimeTaskCache { // task state private String state; @@ -42,7 +42,7 @@ public class ADRealtimeTaskCache { // To avoid repeated query when there is no data, record whether we have done that or not. private boolean queriedResultIndex; - public ADRealtimeTaskCache(String state, Float initProgress, String error, long detectorIntervalInMillis) { + public RealtimeTaskCache(String state, Float initProgress, String error, long detectorIntervalInMillis) { this.state = state; this.initProgress = initProgress; this.error = error; diff --git a/src/main/java/org/opensearch/timeseries/task/TaskCacheManager.java b/src/main/java/org/opensearch/timeseries/task/TaskCacheManager.java new file mode 100644 index 000000000..fe08f94c8 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/task/TaskCacheManager.java @@ -0,0 +1,251 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.task; + +import static org.opensearch.timeseries.settings.TimeSeriesSettings.MAX_CACHED_DELETED_TASKS; + +import java.time.Instant; +import java.util.Map; +import java.util.Queue; +import java.util.concurrent.ConcurrentHashMap; +import java.util.concurrent.ConcurrentLinkedQueue; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.opensearch.cluster.service.ClusterService; +import org.opensearch.common.settings.Settings; +import org.opensearch.timeseries.model.TaskState; + +public class TaskCacheManager { + private final Logger logger = LogManager.getLogger(TaskCacheManager.class); + /** + * This field is to cache all realtime tasks on coordinating node. + *

<p>Node: coordinating node</p> + * <p>Key is config id</p>

+ */ + private Map<String, RealtimeTaskCache> realtimeTaskCaches; + + /** + * This field is to cache all deleted config-level tasks on the coordinating node. + * Will try to clean up child tasks and results later. + *

<p>Node: coordinating node</p>

+ * Check {@link ForecastTaskManager#cleanChildTasksAndResultsOfDeletedTask()} + */ + private Queue<String> deletedTasks; + + protected volatile Integer maxCachedDeletedTask; + /** + * This field is to cache deleted detector IDs. Hourly cron will poll this queue + * and clean AD results. Check ADTaskManager#cleanResultOfDeletedConfig() + *

<p>Node: any data node that serves a delete detector request</p>

+ */ + protected Queue deletedConfigs; + + public TaskCacheManager(Settings settings, ClusterService clusterService) { + this.realtimeTaskCaches = new ConcurrentHashMap<>(); + this.deletedTasks = new ConcurrentLinkedQueue<>(); + this.maxCachedDeletedTask = MAX_CACHED_DELETED_TASKS.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(MAX_CACHED_DELETED_TASKS, it -> maxCachedDeletedTask = it); + this.deletedConfigs = new ConcurrentLinkedQueue<>(); + } + + public RealtimeTaskCache getRealtimeTaskCache(String configId) { + return realtimeTaskCaches.get(configId); + } + + public void initRealtimeTaskCache(String configId, long configIntervalInMillis) { + realtimeTaskCaches.put(configId, new RealtimeTaskCache(null, null, null, configIntervalInMillis)); + logger.debug("Realtime task cache inited"); + } + + /** + * Add deleted task's id to deleted tasks queue. + * @param taskId task id + */ + public void addDeletedTask(String taskId) { + if (deletedTasks.size() < maxCachedDeletedTask) { + deletedTasks.add(taskId); + } + } + + /** + * Check if deleted task queue has items. + * @return true if has deleted task in cache + */ + public boolean hasDeletedTask() { + return !deletedTasks.isEmpty(); + } + + /** + * Poll one deleted task. + * @return task id + */ + public String pollDeletedTask() { + return this.deletedTasks.poll(); + } + + /** + * Clear realtime task cache. + */ + public void clearRealtimeTaskCache() { + realtimeTaskCaches.clear(); + } + + /** + * Check if realtime task field value change needed or not by comparing with cache. + * 1. If new field value is null, will consider changed needed to this field. + * 2. will consider the real time task change needed if + * 1) init progress is larger or the old init progress is null, or + * 2) if the state is different, and it is not changing from running to init. + * for other fields, as long as field values changed, will consider the realtime + * task change needed. We did this so that the init progress or state won't go backwards. + * 3. If realtime task cache not found, will consider the realtime task change needed. + * + * @param detectorId detector id + * @param newState new task state + * @param newInitProgress new init progress + * @param newError new error + * @return true if realtime task change needed. + */ + public boolean isRealtimeTaskChangeNeeded(String detectorId, String newState, Float newInitProgress, String newError) { + if (realtimeTaskCaches.containsKey(detectorId)) { + RealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(detectorId); + boolean stateChangeNeeded = false; + String oldState = realtimeTaskCache.getState(); + if (newState != null + && !newState.equals(oldState) + && !(TaskState.INIT.name().equals(newState) && TaskState.RUNNING.name().equals(oldState))) { + stateChangeNeeded = true; + } + boolean initProgressChangeNeeded = false; + Float existingProgress = realtimeTaskCache.getInitProgress(); + if (newInitProgress != null + && !newInitProgress.equals(existingProgress) + && (existingProgress == null || newInitProgress > existingProgress)) { + initProgressChangeNeeded = true; + } + boolean errorChanged = false; + if (newError != null && !newError.equals(realtimeTaskCache.getError())) { + errorChanged = true; + } + if (stateChangeNeeded || initProgressChangeNeeded || errorChanged) { + return true; + } + return false; + } else { + return true; + } + } + + /** + * Update realtime task cache with new field values. 
If the realtime task cache exists, update it + * directly if the task is not done; if the task is done, remove the detector's realtime task cache. + * + * If the realtime task cache doesn't exist, do nothing. The next realtime job run will re-init the + * realtime task cache when it finds the task cache is not inited yet. + * Check ADTaskManager#initCacheWithCleanupIfRequired(String, AnomalyDetector, TransportService, ActionListener), + * ADTaskManager#updateLatestRealtimeTaskOnCoordinatingNode(String, String, Long, Long, String, ActionListener) + * + * @param detectorId detector id + * @param newState new task state + * @param newInitProgress new init progress + * @param newError new error + */ + public void updateRealtimeTaskCache(String detectorId, String newState, Float newInitProgress, String newError) { + RealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(detectorId); + if (realtimeTaskCache != null) { + if (newState != null) { + realtimeTaskCache.setState(newState); + } + if (newInitProgress != null) { + realtimeTaskCache.setInitProgress(newInitProgress); + } + if (newError != null) { + realtimeTaskCache.setError(newError); + } + if (newState != null && !TaskState.NOT_ENDED_STATES.contains(newState)) { + // If the task is done, remove its realtime task cache. + logger.info("Realtime task done with state {}, remove RT task cache for detector {}", newState, detectorId); + removeRealtimeTaskCache(detectorId); + } + } else { + logger.debug("Realtime task cache is not inited yet for detector {}", detectorId); + } + } + + public void refreshRealtimeJobRunTime(String detectorId) { + RealtimeTaskCache taskCache = realtimeTaskCaches.get(detectorId); + if (taskCache != null) { + taskCache.setLastJobRunTime(Instant.now().toEpochMilli()); + } + } + + /** + * Get detector IDs from realtime task cache. + * @return array of detector ids + */ + public String[] getDetectorIdsInRealtimeTaskCache() { + return realtimeTaskCaches.keySet().toArray(new String[0]); + } + + /** + * Remove detector's realtime task from cache. + * @param detectorId detector id + */ + public void removeRealtimeTaskCache(String detectorId) { + if (realtimeTaskCaches.containsKey(detectorId)) { + logger.info("Delete realtime cache for detector {}", detectorId); + realtimeTaskCaches.remove(detectorId); + } + } + + /** + * We query the result index to check if there are any results generated for the detector, to tell whether it passed initialization or not. + * To avoid repeated queries when there is no data, record whether we have done that or not. + * @param id detector id + */ + public void markResultIndexQueried(String id) { + RealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(id); + // we initialize a real time cache at the beginning of AnomalyResultTransportAction if it + // cannot be found. If the cache is empty, we will return early and wait for it to be + // initialized. + if (realtimeTaskCache != null) { + realtimeTaskCache.setQueriedResultIndex(true); + } + } + + /** + * We query the result index to check if there are any results generated for the detector, to tell whether it passed initialization or not. + * + * @param id detector id + * @return whether we have queried the result index or not. + */ + public boolean hasQueriedResultIndex(String id) { + RealtimeTaskCache realtimeTaskCache = realtimeTaskCaches.get(id); + if (realtimeTaskCache != null) { + return realtimeTaskCache.hasQueriedResultIndex(); + } + return false; + } + + /** + * Add deleted config's id to the deleted config queue.
+ * @param configId config id + */ + public void addDeletedConfig(String configId) { + if (deletedConfigs.size() < maxCachedDeletedTask) { + deletedConfigs.add(configId); + } + } + + /** + * Poll one deleted config. + * @return config id + */ + public String pollDeletedConfig() { + return this.deletedConfigs.poll(); + } +} diff --git a/src/main/java/org/opensearch/ad/transport/BackPressureRouting.java b/src/main/java/org/opensearch/timeseries/transport/BackPressureRouting.java similarity index 98% rename from src/main/java/org/opensearch/ad/transport/BackPressureRouting.java rename to src/main/java/org/opensearch/timeseries/transport/BackPressureRouting.java index e5f4ba9b8..bfec0fe95 100644 --- a/src/main/java/org/opensearch/ad/transport/BackPressureRouting.java +++ b/src/main/java/org/opensearch/timeseries/transport/BackPressureRouting.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.transport; +package org.opensearch.timeseries.transport; import java.time.Clock; import java.util.concurrent.atomic.AtomicInteger; diff --git a/src/main/java/org/opensearch/timeseries/transport/JobResponse.java b/src/main/java/org/opensearch/timeseries/transport/JobResponse.java new file mode 100644 index 000000000..faa7df2c8 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/transport/JobResponse.java @@ -0,0 +1,48 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.transport; + +import java.io.IOException; + +import org.opensearch.core.action.ActionResponse; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.common.io.stream.StreamOutput; +import org.opensearch.core.xcontent.ToXContentObject; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.util.RestHandlerUtils; + +public class JobResponse extends ActionResponse implements ToXContentObject { + private final String id; + + public JobResponse(StreamInput in) throws IOException { + super(in); + id = in.readString(); + } + + public JobResponse(String id) { + this.id = id; + } + + public String getId() { + return id; + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeString(id); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + return builder.startObject().field(RestHandlerUtils._ID, id).endObject(); + } +} diff --git a/src/main/java/org/opensearch/timeseries/util/ClientUtil.java b/src/main/java/org/opensearch/timeseries/util/ClientUtil.java new file mode 100644 index 000000000..394065b2c --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/util/ClientUtil.java @@ -0,0 +1,68 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.timeseries.util; + +import java.util.function.BiConsumer; + +import org.opensearch.action.ActionRequest; +import org.opensearch.action.ActionType; +import org.opensearch.client.Client; +import org.opensearch.common.inject.Inject; +import org.opensearch.core.action.ActionListener; +import org.opensearch.core.action.ActionResponse; + +public class ClientUtil { + private Client client; + + @Inject + public ClientUtil(Client client) { + this.client = client; + } + + /** + * Send an asynchronous request and handle the response with the provided listener. + * @param <Request> ActionRequest + * @param <Response> ActionResponse + * @param request request body + * @param consumer request method, functional interface to operate as a client request like client::get + * @param listener needed to handle the response + */ + public <Request extends ActionRequest, Response extends ActionResponse> void asyncRequest( + Request request, + BiConsumer<Request, ActionListener<Response>> consumer, + ActionListener<Response> listener + ) { + consumer + .accept( + request, + ActionListener.wrap(response -> { listener.onResponse(response); }, exception -> { listener.onFailure(exception); }) + ); + } + + /** + * Execute a transport action and handle the response with the provided listener. + * @param <Request> ActionRequest + * @param <Response> ActionResponse + * @param action transport action + * @param request request body + * @param listener needed to handle the response + */ + public <Request extends ActionRequest, Response extends ActionResponse> void execute( + ActionType<Response> action, + Request request, + ActionListener<Response> listener + ) { + client.execute(action, request, ActionListener.wrap(response -> { listener.onResponse(response); }, exception -> { + listener.onFailure(exception); + })); + } +} diff --git a/src/main/java/org/opensearch/timeseries/util/DataUtil.java b/src/main/java/org/opensearch/timeseries/util/DataUtil.java new file mode 100644 index 000000000..4f417e4f7 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/util/DataUtil.java @@ -0,0 +1,48 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.util; + +import java.util.Arrays; + +public class DataUtil { + /** + * Removes leading rows in a 2D array that contain Double.NaN values. + * + * This method iterates over the rows of the provided 2D array. When it finds the first row + * in which no element is Double.NaN, it removes all rows before that row and keeps + * the rest of the array. The resulting array, which may be smaller than the original, is then returned. + * + * Note: If every row contains at least one Double.NaN, the method returns an empty array. + * + * @param arr The 2D array from which leading rows containing Double.NaN are to be removed. + * @return A possibly smaller 2D array with leading rows containing Double.NaN removed.
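+ * <p>Example (the first NaN-free row and everything after it are kept):
+ * <pre>{@code
+ * double[][] in = { { Double.NaN, 1.0 }, { 2.0, 3.0 }, { 4.0, 5.0 } };
+ * double[][] out = DataUtil.ltrim(in); // { { 2.0, 3.0 }, { 4.0, 5.0 } }
+ * }</pre>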
+ */ + public static double[][] ltrim(double[][] arr) { + int numRows = arr.length; + if (numRows == 0) { + return new double[0][0]; + } + + int numCols = arr[0].length; + int startIndex = numRows; // Initialized to numRows + for (int i = 0; i < numRows; i++) { + boolean hasNaN = false; + for (int j = 0; j < numCols; j++) { + if (Double.isNaN(arr[i][j])) { + hasNaN = true; + break; + } + } + if (!hasNaN) { + startIndex = i; + break; // Stop the loop as soon as a row without NaN is found + } + } + + return Arrays.copyOfRange(arr, startIndex, arr.length); + } + +} diff --git a/src/main/java/org/opensearch/ad/util/DiscoveryNodeFilterer.java b/src/main/java/org/opensearch/timeseries/util/DiscoveryNodeFilterer.java similarity index 92% rename from src/main/java/org/opensearch/ad/util/DiscoveryNodeFilterer.java rename to src/main/java/org/opensearch/timeseries/util/DiscoveryNodeFilterer.java index ed42a4199..ca3ba4eba 100644 --- a/src/main/java/org/opensearch/ad/util/DiscoveryNodeFilterer.java +++ b/src/main/java/org/opensearch/timeseries/util/DiscoveryNodeFilterer.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.util; +package org.opensearch.timeseries.util; import java.util.ArrayList; import java.util.List; @@ -17,7 +17,7 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.cluster.ClusterState; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; @@ -91,8 +91,8 @@ public boolean test(DiscoveryNode discoveryNode) { return discoveryNode.isDataNode() && discoveryNode .getAttributes() - .getOrDefault(CommonName.BOX_TYPE_KEY, CommonName.HOT_BOX_TYPE) - .equals(CommonName.HOT_BOX_TYPE); + .getOrDefault(ADCommonName.BOX_TYPE_KEY, ADCommonName.HOT_BOX_TYPE) + .equals(ADCommonName.HOT_BOX_TYPE); } } } diff --git a/src/main/java/org/opensearch/ad/util/ExceptionUtil.java b/src/main/java/org/opensearch/timeseries/util/ExceptionUtil.java similarity index 95% rename from src/main/java/org/opensearch/ad/util/ExceptionUtil.java rename to src/main/java/org/opensearch/timeseries/util/ExceptionUtil.java index 36f22ffa4..e4b7c751e 100644 --- a/src/main/java/org/opensearch/ad/util/ExceptionUtil.java +++ b/src/main/java/org/opensearch/timeseries/util/ExceptionUtil.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.util; +package org.opensearch.timeseries.util; import java.util.EnumSet; import java.util.concurrent.RejectedExecutionException; @@ -22,14 +22,14 @@ import org.opensearch.action.UnavailableShardsException; import org.opensearch.action.index.IndexResponse; import org.opensearch.action.support.replication.ReplicationResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.LimitExceededException; import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper; import org.opensearch.core.concurrency.OpenSearchRejectedExecutionException; import org.opensearch.core.rest.RestStatus; import org.opensearch.index.IndexNotFoundException; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; public class ExceptionUtil { // a positive cache of retriable error rest status @@ -93,7 +93,7 @@ public static String getShardsFailure(IndexResponse response) { * @return true if should count in AD failure stats; otherwise return false */ public static boolean countInStats(Exception e) { - if (!(e instanceof AnomalyDetectionException) || ((AnomalyDetectionException) e).isCountedInStats()) { + if (!(e instanceof TimeSeriesException) || ((TimeSeriesException) e).isCountedInStats()) { return true; } return false; @@ -106,7 +106,7 @@ public static boolean countInStats(Exception e) { * @return readable error message or full stack trace */ public static String getErrorMessage(Exception e) { - if (e instanceof IllegalArgumentException || e instanceof AnomalyDetectionException) { + if (e instanceof IllegalArgumentException || e instanceof TimeSeriesException) { return e.getMessage(); } else if (e instanceof OpenSearchException) { return ((OpenSearchException) e).getDetailedMessage(); diff --git a/src/main/java/org/opensearch/ad/util/MultiResponsesDelegateActionListener.java b/src/main/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListener.java similarity index 98% rename from src/main/java/org/opensearch/ad/util/MultiResponsesDelegateActionListener.java rename to src/main/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListener.java index e83488330..5d0998d27 100644 --- a/src/main/java/org/opensearch/ad/util/MultiResponsesDelegateActionListener.java +++ b/src/main/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListener.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.util; +package org.opensearch.timeseries.util; import java.util.ArrayList; import java.util.Collections; diff --git a/src/main/java/org/opensearch/ad/util/ParseUtils.java b/src/main/java/org/opensearch/timeseries/util/ParseUtils.java similarity index 80% rename from src/main/java/org/opensearch/ad/util/ParseUtils.java rename to src/main/java/org/opensearch/timeseries/util/ParseUtils.java index e6ddd2e4f..0978a0de5 100644 --- a/src/main/java/org/opensearch/ad/util/ParseUtils.java +++ b/src/main/java/org/opensearch/timeseries/util/ParseUtils.java @@ -9,20 +9,14 @@ * GitHub history for details. 
 */
 
-package org.opensearch.ad.util;
-
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG;
-import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_GET_USER_INFO;
-import static org.opensearch.ad.constant.CommonErrorMessages.NO_PERMISSION_TO_ACCESS_DETECTOR;
-import static org.opensearch.ad.constant.CommonName.DATE_HISTOGRAM;
-import static org.opensearch.ad.constant.CommonName.EPOCH_MILLIS_FORMAT;
-import static org.opensearch.ad.constant.CommonName.FEATURE_AGGS;
-import static org.opensearch.ad.model.AnomalyDetector.QUERY_PARAM_PERIOD_END;
-import static org.opensearch.ad.model.AnomalyDetector.QUERY_PARAM_PERIOD_START;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PIECE_SIZE;
+package org.opensearch.timeseries.util;
+
+import static org.opensearch.ad.constant.ADCommonName.EPOCH_MILLIS_FORMAT;
 import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken;
 import static org.opensearch.search.aggregations.AggregationBuilders.dateRange;
 import static org.opensearch.search.aggregations.AggregatorFactories.VALID_AGG_NAME;
+import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG;
+import static org.opensearch.timeseries.settings.TimeSeriesSettings.MAX_BATCH_TASK_PIECE_SIZE;
 
 import java.io.IOException;
 import java.time.Instant;
@@ -44,14 +38,7 @@
 import org.opensearch.action.get.GetRequest;
 import org.opensearch.action.get.GetResponse;
 import org.opensearch.action.search.SearchResponse;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
-import org.opensearch.ad.constant.CommonName;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.Feature;
-import org.opensearch.ad.model.FeatureData;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
 import org.opensearch.ad.transport.GetAnomalyDetectorResponse;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
@@ -82,6 +69,15 @@
 import org.opensearch.search.aggregations.bucket.range.DateRangeAggregationBuilder;
 import org.opensearch.search.aggregations.metrics.Max;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Config;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.model.FeatureData;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
 
 import com.google.common.collect.ImmutableList;
 
@@ -336,18 +332,18 @@ public static SearchSourceBuilder generateInternalFeatureQuery(
     }
 
     public static SearchSourceBuilder generatePreviewQuery(
-        AnomalyDetector detector,
+        Config config,
         List<Entry<Long, Long>> ranges,
         NamedXContentRegistry xContentRegistry
     ) throws IOException {
 
-        DateRangeAggregationBuilder dateRangeBuilder = dateRange("date_range").field(detector.getTimeField()).format("epoch_millis");
+        DateRangeAggregationBuilder dateRangeBuilder = dateRange("date_range").field(config.getTimeField()).format("epoch_millis");
         for (Entry<Long, Long> range : ranges) {
             dateRangeBuilder.addRange(range.getKey(), range.getValue());
         }
 
-        if (detector.getFeatureAttributes() != null) {
-            for (Feature feature : detector.getFeatureAttributes()) {
+        if (config.getFeatureAttributes() != null) {
+            for (Feature feature : config.getFeatureAttributes()) {
                 AggregatorFactories.Builder internalAgg = parseAggregators(
                     feature.getAggregation().toString(),
                     xContentRegistry,
@@ -357,52 +353,31 @@ public static SearchSourceBuilder generatePreviewQuery(
             }
         }
 
-        return new SearchSourceBuilder().query(detector.getFilterQuery()).size(0).aggregation(dateRangeBuilder);
+        return new SearchSourceBuilder().query(config.getFilterQuery()).size(0).aggregation(dateRangeBuilder);
     }
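A caller-side sketch of the refactored preview query, assuming the reconstructed List<Entry<Long, Long>> parameter above (values and variable names hypothetical):

    // Preview one hour of data; each entry's key/value are the bucket's epoch-millis start/end.
    List<Entry<Long, Long>> ranges = List.of(Map.entry(1700000000000L, 1700003600000L));
    SearchSourceBuilder preview = ParseUtils.generatePreviewQuery(config, ranges, xContentRegistry);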
-    public static String generateInternalFeatureQueryTemplate(AnomalyDetector detector, NamedXContentRegistry xContentRegistry)
-        throws IOException {
-        RangeQueryBuilder rangeQuery = new RangeQueryBuilder(detector.getTimeField())
-            .from("{{" + QUERY_PARAM_PERIOD_START + "}}")
-            .to("{{" + QUERY_PARAM_PERIOD_END + "}}");
-
-        BoolQueryBuilder internalFilterQuery = QueryBuilders.boolQuery().must(rangeQuery).must(detector.getFilterQuery());
-
-        SearchSourceBuilder internalSearchSourceBuilder = new SearchSourceBuilder().query(internalFilterQuery);
-        if (detector.getFeatureAttributes() != null) {
-            for (Feature feature : detector.getFeatureAttributes()) {
-                AggregatorFactories.Builder internalAgg = parseAggregators(
-                    feature.getAggregation().toString(),
-                    xContentRegistry,
-                    feature.getId()
-                );
-                internalSearchSourceBuilder.aggregation(internalAgg.getAggregatorFactories().iterator().next());
-            }
-        }
-
-        return internalSearchSourceBuilder.toString();
-    }
-
-    public static SearchSourceBuilder generateEntityColdStartQuery(
-        AnomalyDetector detector,
+    public static SearchSourceBuilder generateColdStartQuery(
+        Config config,
         List<Entry<Long, Long>> ranges,
-        Entity entity,
+        Optional<Entity> entity,
         NamedXContentRegistry xContentRegistry
     ) throws IOException {
 
-        BoolQueryBuilder internalFilterQuery = QueryBuilders.boolQuery().filter(detector.getFilterQuery());
+        BoolQueryBuilder internalFilterQuery = QueryBuilders.boolQuery().filter(config.getFilterQuery());
 
-        for (TermQueryBuilder term : entity.getTermQueryBuilders()) {
-            internalFilterQuery.filter(term);
+        if (entity.isPresent()) {
+            for (TermQueryBuilder term : entity.get().getTermQueryBuilders()) {
+                internalFilterQuery.filter(term);
+            }
         }
 
-        DateRangeAggregationBuilder dateRangeBuilder = dateRange("date_range").field(detector.getTimeField()).format("epoch_millis");
+        DateRangeAggregationBuilder dateRangeBuilder = dateRange("date_range").field(config.getTimeField()).format("epoch_millis");
         for (Entry<Long, Long> range : ranges) {
             dateRangeBuilder.addRange(range.getKey(), range.getValue());
         }
 
-        if (detector.getFeatureAttributes() != null) {
-            for (Feature feature : detector.getFeatureAttributes()) {
+        if (config.getFeatureAttributes() != null) {
+            for (Feature feature : config.getFeatureAttributes()) {
                 AggregatorFactories.Builder internalAgg = parseAggregators(
                     feature.getAggregation().toString(),
                     xContentRegistry,
@@ -450,7 +425,7 @@ public static SearchSourceBuilder addUserBackendRolesFilter(User user, SearchSou
         } else if (query instanceof BoolQueryBuilder) {
             ((BoolQueryBuilder) query).filter(boolQueryBuilder);
         } else {
-            throw new AnomalyDetectionException("Search API does not support queries other than BoolQuery");
+            throw new TimeSeriesException("Search API does not support queries other than BoolQuery");
         }
         return searchSourceBuilder;
     }
@@ -469,23 +444,34 @@ public static User getUserContext(Client client) {
         return User.parse(userStr);
     }
 
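Caller-side sketch of the generic resolveUserAndExecute introduced just below, assuming the reconstructed <ConfigType extends Config> signature (wiring and handler names hypothetical):

    // Resolve the requesting user, apply backend-role filtering, then act on the typed config.
    ParseUtils
        .resolveUserAndExecute(
            requestUser,
            configId,
            filterByEnabled,
            listener,
            (AnomalyDetector config) -> handleDetector(config), // binds ConfigType to AnomalyDetector
            client,
            clusterService,
            xContentRegistry,
            AnomalyDetector.class
        );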
-    public static void resolveUserAndExecute(
+    public static <ConfigType extends Config> void resolveUserAndExecute(
         User requestedUser,
-        String detectorId,
+        String configId,
         boolean filterByEnabled,
         ActionListener listener,
-        Consumer<AnomalyDetector> function,
+        Consumer<ConfigType> function,
         Client client,
         ClusterService clusterService,
-        NamedXContentRegistry xContentRegistry
+        NamedXContentRegistry xContentRegistry,
+        Class<ConfigType> configTypeClass
     ) {
         try {
-            if (requestedUser == null || detectorId == null) {
+            if (requestedUser == null || configId == null) {
                 // requestedUser == null means security is disabled or user is superadmin. In this case we don't need to
                 // check if request user have access to the detector or not.
                 function.accept(null);
             } else {
-                getDetector(requestedUser, detectorId, listener, function, client, clusterService, xContentRegistry, filterByEnabled);
+                getConfig(
+                    requestedUser,
+                    configId,
+                    listener,
+                    function,
+                    client,
+                    clusterService,
+                    xContentRegistry,
+                    filterByEnabled,
+                    configTypeClass
+                );
             }
         } catch (Exception e) {
             listener.onFailure(e);
@@ -493,87 +479,115 @@ public static void resolveUserAndExecute(
         }
     }
 
     /**
-     * If filterByEnabled is true, get detector and check if the user has permissions to access the detector,
-     * then execute function; otherwise, get detector and execute function
+     * If filterByEnabled is true, get config and check if the user has permissions to access the config,
+     * then execute function; otherwise, get config and execute function
      * @param requestUser user from request
-     * @param detectorId detector id
+     * @param configId config id
      * @param listener action listener
      * @param function consumer function
      * @param client client
     * @param clusterService cluster service
     * @param xContentRegistry XContent registry
     * @param filterByBackendRole filter by backend role or not
+     * @param configTypeClass the class of the ConfigType, used by the ConfigFactory to parse the correct type of Config
     */
-    public static void getDetector(
+    public static <ConfigType extends Config> void getConfig(
         User requestUser,
-        String detectorId,
+        String configId,
         ActionListener listener,
-        Consumer<AnomalyDetector> function,
+        Consumer<ConfigType> function,
         Client client,
         ClusterService clusterService,
         NamedXContentRegistry xContentRegistry,
-        boolean filterByBackendRole
+        boolean filterByBackendRole,
+        Class<ConfigType> configTypeClass
     ) {
-        if (clusterService.state().metadata().indices().containsKey(AnomalyDetector.ANOMALY_DETECTORS_INDEX)) {
-            GetRequest request = new GetRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX).id(detectorId);
+        if (clusterService.state().metadata().indices().containsKey(CommonName.CONFIG_INDEX)) {
+            GetRequest request = new GetRequest(CommonName.CONFIG_INDEX).id(configId);
             client
                 .get(
                     request,
                     ActionListener
                         .wrap(
-                            response -> onGetAdResponse(
+                            response -> onGetConfigResponse(
                                 response,
                                 requestUser,
-                                detectorId,
+                                configId,
                                 listener,
                                 function,
                                 xContentRegistry,
-                                filterByBackendRole
+                                filterByBackendRole,
+                                configTypeClass
                             ),
                             exception -> {
-                                logger.error("Failed to get anomaly detector: " + detectorId, exception);
+                                logger.error("Failed to get config: " + configId, exception);
                                 listener.onFailure(exception);
                             }
                         )
                 );
         } else {
-            listener.onFailure(new IndexNotFoundException(AnomalyDetector.ANOMALY_DETECTORS_INDEX));
+            listener.onFailure(new IndexNotFoundException(CommonName.CONFIG_INDEX));
         }
     }
 
-    public static void onGetAdResponse(
+    /**
+     * Processes a GetResponse by leveraging the factory method Config.parseConfig to
+     * appropriately parse the specified type of Config. The execution of the provided
+     * consumer function depends on the state of the 'filterByBackendRole' setting:
+     *
+     * - If 'filterByBackendRole' is disabled, the consumer function will be invoked
+     *   irrespective of the user's permissions.
+     *
+     * - If 'filterByBackendRole' is enabled, the consumer function will only be invoked
+     *   provided the user holds the requisite permissions.
+     *
+     * @param <ConfigType> The type of Config to be processed in this method, which extends from the Config base type.
+     * @param response The GetResponse from the getConfig request. This contains the information about the config that is to be processed.
+     * @param requestUser The User from the request. This user's permissions will be checked to ensure they have access to the config.
+     * @param configId The ID of the config. This is used for logging and error messages.
+     * @param listener The ActionListener to call if an error occurs. Any errors that occur during the processing of the config will be passed to this listener.
+     * @param function The Consumer function to apply to the ConfigType. If the user has permission to access the config, this function will be applied.
+     * @param xContentRegistry The XContentRegistry used to create the XContentParser. This is used to parse the response into a ConfigType.
+     * @param filterByBackendRole A boolean indicating whether to filter by backend role. If true, the user's backend roles will be checked to ensure they have access to the config.
+     * @param configTypeClass The class of the ConfigType, used by the ConfigFactory to parse the correct type of Config.
+     */
+    public static <ConfigType extends Config> void onGetConfigResponse(
         GetResponse response,
         User requestUser,
-        String detectorId,
+        String configId,
         ActionListener listener,
-        Consumer<AnomalyDetector> function,
+        Consumer<ConfigType> function,
         NamedXContentRegistry xContentRegistry,
-        boolean filterByBackendRole
+        boolean filterByBackendRole,
+        Class<ConfigType> configTypeClass
     ) {
         if (response.isExists()) {
             try (
                 XContentParser parser = RestHandlerUtils.createXContentParserFromRegistry(xContentRegistry, response.getSourceAsBytesRef())
             ) {
                 ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
-                AnomalyDetector detector = AnomalyDetector.parse(parser);
-                User resourceUser = detector.getUser();
+                @SuppressWarnings("unchecked")
+                ConfigType config = (ConfigType) Config.parseConfig(configTypeClass, parser);
+                User resourceUser = config.getUser();
 
-                if (!filterByBackendRole || checkUserPermissions(requestUser, resourceUser, detectorId) || isAdmin(requestUser)) {
-                    function.accept(detector);
+                if (!filterByBackendRole || checkUserPermissions(requestUser, resourceUser, configId) || isAdmin(requestUser)) {
+                    function.accept(config);
                 } else {
-                    logger.debug("User: " + requestUser.getName() + " does not have permissions to access detector: " + detectorId);
-                    listener.onFailure(new AnomalyDetectionException(NO_PERMISSION_TO_ACCESS_DETECTOR + detectorId));
+                    logger.debug("User: " + requestUser.getName() + " does not have permissions to access config: " + configId);
+                    listener.onFailure(new TimeSeriesException(CommonMessages.NO_PERMISSION_TO_ACCESS_CONFIG + configId));
                 }
             } catch (Exception e) {
-                listener.onFailure(new AnomalyDetectionException(FAIL_TO_GET_USER_INFO + detectorId));
+                listener.onFailure(new TimeSeriesException(CommonMessages.FAIL_TO_GET_USER_INFO + configId));
             }
         } else {
-            listener.onFailure(new ResourceNotFoundException(detectorId, FAIL_TO_FIND_DETECTOR_MSG + detectorId));
+            listener.onFailure(new ResourceNotFoundException(configId, FAIL_TO_FIND_CONFIG_MSG + configId));
         }
     }
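A one-line sketch of the factory dispatch used above, assuming Config.parseConfig selects the parser from the class token (variable names hypothetical):

    // Parse the GetResponse source as a concrete Config subtype.
    AnomalyDetector detector = (AnomalyDetector) Config.parseConfig(AnomalyDetector.class, parser);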
 
     /**
-     * 'all_access' role users are treated as admins.
+     * 'all_access' role users are treated as admins.
+     * @param user user to check
+     * @return true if the user has the 'all_access' role
      */
     public static boolean isAdmin(User user) {
         if (user == null) {
@@ -611,7 +625,7 @@ public static boolean checkFilterByBackendRoles(User requestedUser, ActionListen
         if (requestedUser.getBackendRoles().isEmpty()) {
             listener
                 .onFailure(
-                    new AnomalyDetectionException(
+                    new TimeSeriesException(
                         "Filter by backend roles is enabled and User " + requestedUser.getName() + " does not have backend roles configured"
                     )
                 );
@@ -644,7 +658,7 @@ public static Optional getLatestDataTime(SearchResponse searchResponse) {
      * @param xContentRegistry content registry
      * @return search source builder
      * @throws IOException throw IO exception if fail to parse feature aggregation
-     * @throws AnomalyDetectionException throw AD exception if no enabled feature
+     * @throws TimeSeriesException throw TimeSeries exception if no enabled feature
      */
     public static SearchSourceBuilder batchFeatureQuery(
         AnomalyDetector detector,
@@ -662,28 +676,28 @@ public static SearchSourceBuilder batchFeatureQuery(
 
         BoolQueryBuilder internalFilterQuery = QueryBuilders.boolQuery().must(rangeQuery).must(detector.getFilterQuery());
 
-        if (detector.isMultientityDetector() && entity != null && entity.getAttributes().size() > 0) {
+        if (detector.isHighCardinality() && entity != null && entity.getAttributes().size() > 0) {
             entity
                 .getAttributes()
                 .entrySet()
                 .forEach(attr -> { internalFilterQuery.filter(new TermQueryBuilder(attr.getKey(), attr.getValue())); });
         }
 
-        long intervalSeconds = ((IntervalTimeConfiguration) detector.getDetectionInterval()).toDuration().getSeconds();
+        long intervalSeconds = ((IntervalTimeConfiguration) detector.getInterval()).toDuration().getSeconds();
 
         List<CompositeValuesSourceBuilder<?>> sources = new ArrayList<>();
         sources
             .add(
-                new DateHistogramValuesSourceBuilder(DATE_HISTOGRAM)
+                new DateHistogramValuesSourceBuilder(CommonName.DATE_HISTOGRAM)
                     .field(detector.getTimeField())
                     .fixedInterval(DateHistogramInterval.seconds((int) intervalSeconds))
             );
 
-        CompositeAggregationBuilder aggregationBuilder = new CompositeAggregationBuilder(FEATURE_AGGS, sources)
+        CompositeAggregationBuilder aggregationBuilder = new CompositeAggregationBuilder(CommonName.FEATURE_AGGS, sources)
             .size(MAX_BATCH_TASK_PIECE_SIZE);
 
         if (detector.getEnabledFeatureIds().size() == 0) {
-            throw new AnomalyDetectionException("No enabled feature configured").countedInStats(false);
+            throw new TimeSeriesException("No enabled feature configured").countedInStats(false);
         }
 
         for (Feature feature : detector.getFeatureAttributes()) {
diff --git a/src/main/java/org/opensearch/ad/util/RestHandlerUtils.java b/src/main/java/org/opensearch/timeseries/util/RestHandlerUtils.java
similarity index 87%
rename from src/main/java/org/opensearch/ad/util/RestHandlerUtils.java
rename to src/main/java/org/opensearch/timeseries/util/RestHandlerUtils.java
index f89ed2330..45e318aa2 100644
--- a/src/main/java/org/opensearch/ad/util/RestHandlerUtils.java
+++ b/src/main/java/org/opensearch/timeseries/util/RestHandlerUtils.java
@@ -9,7 +9,7 @@
  * GitHub history for details.
*/ -package org.opensearch.ad.util; +package org.opensearch.timeseries.util; import static org.opensearch.core.rest.RestStatus.BAD_REQUEST; import static org.opensearch.core.rest.RestStatus.INTERNAL_SERVER_ERROR; @@ -25,11 +25,6 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.search.SearchPhaseExecutionException; import org.opensearch.action.search.ShardSearchFailure; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Feature; import org.opensearch.common.Nullable; import org.opensearch.common.xcontent.LoggingDeprecationHandler; import org.opensearch.common.xcontent.XContentHelper; @@ -47,6 +42,11 @@ import org.opensearch.rest.RestRequest; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.fetch.subphase.FetchSourceContext; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Feature; import com.google.common.base.Throwables; import com.google.common.collect.ImmutableMap; @@ -84,7 +84,11 @@ public final class RestHandlerUtils { public static final ToXContent.MapParams XCONTENT_WITH_TYPE = new ToXContent.MapParams(ImmutableMap.of("with_type", "true")); public static final String OPENSEARCH_DASHBOARDS_USER_AGENT = "OpenSearch Dashboards"; - public static final String[] UI_METADATA_EXCLUDE = new String[] { AnomalyDetector.UI_METADATA_FIELD }; + public static final String[] UI_METADATA_EXCLUDE = new String[] { Config.UI_METADATA_FIELD }; + + public static final String FORECASTER_ID = "forecasterID"; + public static final String FORECASTER = "forecaster"; + public static final String REST_STATUS = "rest_status"; private RestHandlerUtils() {} @@ -130,18 +134,18 @@ public static XContentParser createXContentParserFromRegistry(NamedXContentRegis } /** - * Check if there is configuration/syntax error in feature definition of anomalyDetector - * @param anomalyDetector detector to check - * @param maxAnomalyFeatures max allowed feature number + * Check if there is configuration/syntax error in feature definition of config + * @param config config to check + * @param maxFeatures max allowed feature number * @return error message if error exists; otherwise, null is returned */ - public static String checkAnomalyDetectorFeaturesSyntax(AnomalyDetector anomalyDetector, int maxAnomalyFeatures) { - List features = anomalyDetector.getFeatureAttributes(); + public static String checkFeaturesSyntax(Config config, int maxFeatures) { + List features = config.getFeatureAttributes(); if (features != null) { - if (features.size() > maxAnomalyFeatures) { - return "Can't create more than " + maxAnomalyFeatures + " anomaly features"; + if (features.size() > maxFeatures) { + return "Can't create more than " + maxFeatures + " features"; } - return validateFeaturesConfig(anomalyDetector.getFeatureAttributes()); + return validateFeaturesConfig(config.getFeatureAttributes()); } return null; } @@ -163,14 +167,14 @@ private static String validateFeaturesConfig(List features) { StringBuilder errorMsgBuilder = new StringBuilder(); if (duplicateFeatureNames.size() > 0) { - 
errorMsgBuilder.append("Detector has duplicate feature names: "); + errorMsgBuilder.append("There are duplicate feature names: "); errorMsgBuilder.append(String.join(", ", duplicateFeatureNames)); } if (errorMsgBuilder.length() != 0 && duplicateFeatureAggNames.size() > 0) { errorMsgBuilder.append(". "); } if (duplicateFeatureAggNames.size() > 0) { - errorMsgBuilder.append(CommonErrorMessages.DUPLICATE_FEATURE_AGGREGATION_NAMES); + errorMsgBuilder.append(CommonMessages.DUPLICATE_FEATURE_AGGREGATION_NAMES); errorMsgBuilder.append(String.join(", ", duplicateFeatureAggNames)); } return errorMsgBuilder.toString(); @@ -193,9 +197,9 @@ public static boolean isExceptionCausedByInvalidQuery(Exception ex) { /** * Wrap action listener to avoid return verbose error message and wrong 500 error to user. - * Suggestion for exception handling in AD: + * Suggestion for exception handling in timeseries analysis (e.g., AD and Forecast): * 1. If the error is caused by wrong input, throw IllegalArgumentException exception. - * 2. For other errors, please use AnomalyDetectionException or its subclass, or use + * 2. For other errors, please use TimeSeriesException or its subclass, or use * OpenSearchStatusException. * * TODO: tune this function for wrapped exception, return root exception error message @@ -216,9 +220,9 @@ public static ActionListener wrapRestActionListener(ActionListener action } else { RestStatus status = isBadRequest(e) ? BAD_REQUEST : INTERNAL_SERVER_ERROR; String errorMessage = generalErrorMessage; - if (isBadRequest(e) || e instanceof AnomalyDetectionException) { + if (isBadRequest(e) || e instanceof TimeSeriesException) { errorMessage = e.getMessage(); - } else if (cause != null && (isBadRequest(cause) || cause instanceof AnomalyDetectionException)) { + } else if (cause != null && (isBadRequest(cause) || cause instanceof TimeSeriesException)) { errorMessage = cause.getMessage(); } actionListener.onFailure(new OpenSearchStatusException(errorMessage, status)); diff --git a/src/main/java/org/opensearch/timeseries/util/SafeSecurityInjector.java b/src/main/java/org/opensearch/timeseries/util/SafeSecurityInjector.java new file mode 100644 index 000000000..671aa0466 --- /dev/null +++ b/src/main/java/org/opensearch/timeseries/util/SafeSecurityInjector.java @@ -0,0 +1,87 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. + */ + +package org.opensearch.timeseries.util; + +import java.util.List; +import java.util.Locale; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.apache.logging.log4j.message.ParameterizedMessage; +import org.opensearch.common.settings.Settings; +import org.opensearch.common.util.concurrent.ThreadContext; +import org.opensearch.commons.ConfigConstants; +import org.opensearch.commons.InjectSecurity; + +public abstract class SafeSecurityInjector implements AutoCloseable { + private static final Logger LOG = LogManager.getLogger(SafeSecurityInjector.class); + // user header used by security plugin. As we cannot take security plugin as + // a compile dependency, we have to duplicate it here. 
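+    // Illustrative note (an assumption, not part of this patch): the security plugin records the
+    // authenticated user as a thread-context transient, conceptually
+    //   threadContext.putTransient(OPENDISTRO_SECURITY_USER, user);
+    // so a non-null transient below means a real user is present and no injection is needed.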
+ private static final String OPENDISTRO_SECURITY_USER = "_opendistro_security_user"; + + private InjectSecurity rolesInjectorHelper; + protected String id; + protected Settings settings; + protected ThreadContext tc; + + public SafeSecurityInjector(String id, Settings settings, ThreadContext tc) { + this.id = id; + this.settings = settings; + this.tc = tc; + this.rolesInjectorHelper = null; + } + + protected boolean shouldInject() { + if (id == null || settings == null || tc == null) { + LOG.debug(String.format(Locale.ROOT, "null value: id: %s, settings: %s, threadContext: %s", id, settings, tc)); + return false; + } + // user not null means the request comes from user (e.g., public restful API) + // we don't need to inject roles. + Object userIn = tc.getTransient(OPENDISTRO_SECURITY_USER); + if (userIn != null) { + LOG.debug(new ParameterizedMessage("User not empty in thread context: [{}]", userIn)); + return false; + } + userIn = tc.getTransient(ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT); + if (userIn != null) { + LOG.debug(new ParameterizedMessage("User not empty in thread context: [{}]", userIn)); + return false; + } + Object rolesin = tc.getTransient(ConfigConstants.OPENSEARCH_SECURITY_INJECTED_ROLES); + if (rolesin != null) { + LOG.warn(new ParameterizedMessage("Injected roles not empty in thread context: [{}]", rolesin)); + return false; + } + + return true; + } + + protected void inject(String user, List roles) { + if (roles == null) { + LOG.warn("Cannot inject empty roles in thread context"); + return; + } + if (rolesInjectorHelper == null) { + // lazy init + rolesInjectorHelper = new InjectSecurity(id, settings, tc); + } + rolesInjectorHelper.inject(user, roles); + } + + @Override + public void close() { + if (rolesInjectorHelper != null) { + rolesInjectorHelper.close(); + } + } +} diff --git a/src/main/java/org/opensearch/ad/util/SecurityClientUtil.java b/src/main/java/org/opensearch/timeseries/util/SecurityClientUtil.java similarity index 82% rename from src/main/java/org/opensearch/ad/util/SecurityClientUtil.java rename to src/main/java/org/opensearch/timeseries/util/SecurityClientUtil.java index ac5625c35..7e7321161 100644 --- a/src/main/java/org/opensearch/ad/util/SecurityClientUtil.java +++ b/src/main/java/org/opensearch/timeseries/util/SecurityClientUtil.java @@ -9,13 +9,12 @@ * GitHub history for details. 
 */
 
-package org.opensearch.ad.util;
+package org.opensearch.timeseries.util;
 
 import java.util.function.BiConsumer;
 
 import org.opensearch.action.ActionRequest;
 import org.opensearch.action.ActionType;
-import org.opensearch.ad.NodeStateManager;
 import org.opensearch.client.Client;
 import org.opensearch.common.inject.Inject;
 import org.opensearch.common.settings.Settings;
@@ -23,6 +22,8 @@
 import org.opensearch.commons.authuser.User;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.action.ActionResponse;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
 
 public class SecurityClientUtil {
     private static final String INJECTION_ID = "direct";
@@ -51,12 +52,21 @@ public void asy
         BiConsumer<Request, ActionListener<Response>> consumer,
         String detectorId,
         Client client,
+        AnalysisType context,
         ActionListener<Response> listener
     ) {
         ThreadContext threadContext = client.threadPool().getThreadContext();
-        try (ADSafeSecurityInjector injectSecurity = new ADSafeSecurityInjector(detectorId, settings, threadContext, nodeStateManager)) {
+        try (
+            TimeSeriesSafeSecurityInjector injectSecurity = new TimeSeriesSafeSecurityInjector(
+                detectorId,
+                settings,
+                threadContext,
+                nodeStateManager,
+                context
+            )
+        ) {
             injectSecurity
-                .injectUserRolesFromDetector(
+                .injectUserRolesFromConfig(
                     ActionListener
                         .wrap(
                             success -> consumer.accept(request, ActionListener.runBefore(listener, () -> injectSecurity.close())),
@@ -82,6 +92,7 @@ public void asy
         BiConsumer<Request, ActionListener<Response>> consumer,
         User user,
         Client client,
+        AnalysisType context,
         ActionListener<Response> listener
     ) {
         ThreadContext threadContext = client.threadPool().getThreadContext();
@@ -95,7 +106,15 @@ public void asy
         // client.execute/client.search and handles the responses (this can be a thread in the search thread pool).
         // Auto-close in try will restore the context in one thread; the explicit close of injectSecurity will restore
         // the context in another thread. So we still need to put the injectSecurity inside try.
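A caller-side sketch, assuming the method behind the truncated hunk header is asyncRequestWithInjectedSecurity (wiring hypothetical):

    // Run a search with the given user's roles injected; runBefore removes the injected
    // roles from the thread context before the listener body executes.
    securityClientUtil
        .asyncRequestWithInjectedSecurity(searchRequest, client::search, user, client, AnalysisType.AD, listener);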
-        try (ADSafeSecurityInjector injectSecurity = new ADSafeSecurityInjector(INJECTION_ID, settings, threadContext, nodeStateManager)) {
+        try (
+            TimeSeriesSafeSecurityInjector injectSecurity = new TimeSeriesSafeSecurityInjector(
+                INJECTION_ID,
+                settings,
+                threadContext,
+                nodeStateManager,
+                context
+            )
+        ) {
             injectSecurity.injectUserRoles(user);
             consumer.accept(request, ActionListener.runBefore(listener, () -> injectSecurity.close()));
         }
@@ -117,12 +136,21 @@ public void exe
         Request request,
         User user,
         Client client,
+        AnalysisType context,
         ActionListener<Response> listener
     ) {
         ThreadContext threadContext = client.threadPool().getThreadContext();
         // use a hardcoded string as detector id that is only used in logging
-        try (ADSafeSecurityInjector injectSecurity = new ADSafeSecurityInjector(INJECTION_ID, settings, threadContext, nodeStateManager)) {
+        try (
+            TimeSeriesSafeSecurityInjector injectSecurity = new TimeSeriesSafeSecurityInjector(
+                INJECTION_ID,
+                settings,
+                threadContext,
+                nodeStateManager,
+                context
+            )
+        ) {
             injectSecurity.injectUserRoles(user);
             client.execute(action, request, ActionListener.runBefore(listener, () -> injectSecurity.close()));
         }
diff --git a/src/main/java/org/opensearch/ad/util/SecurityUtil.java b/src/main/java/org/opensearch/timeseries/util/SecurityUtil.java
similarity index 86%
rename from src/main/java/org/opensearch/ad/util/SecurityUtil.java
rename to src/main/java/org/opensearch/timeseries/util/SecurityUtil.java
index d72d345ab..135116c34 100644
--- a/src/main/java/org/opensearch/ad/util/SecurityUtil.java
+++ b/src/main/java/org/opensearch/timeseries/util/SecurityUtil.java
@@ -9,15 +9,15 @@
  * GitHub history for details.
  */
 
-package org.opensearch.ad.util;
+package org.opensearch.timeseries.util;
 
 import java.util.Collections;
 import java.util.List;
 
-import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.commons.authuser.User;
+import org.opensearch.timeseries.model.Config;
+import org.opensearch.timeseries.model.Job;
 
 import com.google.common.collect.ImmutableList;
 
@@ -57,12 +57,12 @@ private static User getAdjustedUserBWC(User userObj, Settings settings) {
 
     /**
      *
-     * @param detector Detector config
+     * @param config analysis config
      * @param settings Node settings
     * @return user recorded by a detector. Made adjustment for BWC (backward compatibility) if necessary.
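     *
     * Illustrative use (variables hypothetical):
     *   User owner = SecurityUtil.getUserFromConfig(config, settings);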
     */
-    public static User getUserFromDetector(AnomalyDetector detector, Settings settings) {
-        return getAdjustedUserBWC(detector.getUser(), settings);
+    public static User getUserFromConfig(Config config, Settings settings) {
+        return getAdjustedUserBWC(config.getUser(), settings);
     }
 
     /**
@@ -71,7 +71,7 @@ public static User getUserFromDetector(AnomalyDetector detector, Settings settin
      * @param settings Node settings
      * @return user recorded by a detector job
      */
-    public static User getUserFromJob(AnomalyDetectorJob detectorJob, Settings settings) {
+    public static User getUserFromJob(Job detectorJob, Settings settings) {
         return getAdjustedUserBWC(detectorJob.getUser(), settings);
     }
 }
diff --git a/src/main/java/org/opensearch/ad/util/ADSafeSecurityInjector.java b/src/main/java/org/opensearch/timeseries/util/TimeSeriesSafeSecurityInjector.java
similarity index 52%
rename from src/main/java/org/opensearch/ad/util/ADSafeSecurityInjector.java
rename to src/main/java/org/opensearch/timeseries/util/TimeSeriesSafeSecurityInjector.java
index 3f8c7c3ae..0d5e53aef 100644
--- a/src/main/java/org/opensearch/ad/util/ADSafeSecurityInjector.java
+++ b/src/main/java/org/opensearch/timeseries/util/TimeSeriesSafeSecurityInjector.java
@@ -9,31 +9,40 @@
  * GitHub history for details.
  */
 
-package org.opensearch.ad.util;
+package org.opensearch.timeseries.util;
 
 import java.util.Optional;
 
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.Logger;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.common.util.concurrent.ThreadContext;
 import org.opensearch.commons.authuser.User;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.common.Strings;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.model.Config;
 
-public class ADSafeSecurityInjector extends SafeSecurityInjector {
-    private static final Logger LOG = LogManager.getLogger(ADSafeSecurityInjector.class);
+public class TimeSeriesSafeSecurityInjector extends SafeSecurityInjector {
+    private static final Logger LOG = LogManager.getLogger(TimeSeriesSafeSecurityInjector.class);
     private NodeStateManager nodeStateManager;
+    private AnalysisType context;
 
-    public ADSafeSecurityInjector(String detectorId, Settings settings, ThreadContext tc, NodeStateManager stateManager) {
-        super(detectorId, settings, tc);
+    public TimeSeriesSafeSecurityInjector(
+        String configId,
+        Settings settings,
+        ThreadContext tc,
+        NodeStateManager stateManager,
+        AnalysisType context
+    ) {
+        super(configId, settings, tc);
         this.nodeStateManager = stateManager;
+        this.context = context;
     }
 
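A hedged sketch of driving the renamed injector (AnalysisType value and wiring hypothetical):

    // Fetch the config's recorded owner via NodeStateManager, inject that user's roles,
    // then run the guarded action; close() restores the original thread context.
    try (TimeSeriesSafeSecurityInjector injector =
        new TimeSeriesSafeSecurityInjector(configId, settings, threadContext, stateManager, AnalysisType.AD)) {
        injector.injectUserRolesFromConfig(ActionListener.wrap(v -> runAsConfigOwner(), listener::onFailure));
    }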
-    public void injectUserRolesFromDetector(ActionListener<Void> injectListener) {
+    public void injectUserRolesFromConfig(ActionListener<Void> injectListener) {
         // if id is null, we cannot fetch a detector
         if (Strings.isEmpty(id)) {
             LOG.debug("Empty id");
@@ -48,21 +57,21 @@ public void injectUserRolesFromDetector(ActionListener<Void> injectListener) {
             return;
         }
 
-        ActionListener<Optional<AnomalyDetector>> getDetectorListener = ActionListener.wrap(detectorOp -> {
-            if (!detectorOp.isPresent()) {
-                injectListener.onFailure(new EndRunException(id, "AnomalyDetector is not available.", false));
+        ActionListener<Optional<? extends Config>> getConfigListener = ActionListener.wrap(configOp -> {
+            if (!configOp.isPresent()) {
+                injectListener.onFailure(new EndRunException(id, "Config is not available.", false));
                 return;
             }
-            AnomalyDetector detector = detectorOp.get();
-            User userInfo = SecurityUtil.getUserFromDetector(detector, settings);
+            Config config = configOp.get();
+            User userInfo = SecurityUtil.getUserFromConfig(config, settings);
             inject(userInfo.getName(), userInfo.getRoles());
             injectListener.onResponse(null);
         }, injectListener::onFailure);
 
-        // Since we are gonna read user from detector, make sure the anomaly detector exists and fetched from disk or cached memory
-        // We don't accept a passed-in AnomalyDetector because the caller might mistakenly not insert any user info in the
-        // constructed AnomalyDetector and thus poses risks. In the case, if the user is null, we will give admin role.
-        nodeStateManager.getAnomalyDetector(id, getDetectorListener);
+        // Since we are going to read the user from the config, make sure the config exists and is fetched from disk or cached memory.
+        // We don't accept a passed-in Config because the caller might mistakenly not insert any user info in the
+        // constructed Config and thus pose risks. In that case, if the user is null, we will give the admin role.
+        nodeStateManager.getConfig(id, context, getConfigListener);
     }
 
     public void injectUserRoles(User user) {
diff --git a/src/main/resources/META-INF/services/org.opensearch.jobscheduler.spi.JobSchedulerExtension b/src/main/resources/META-INF/services/org.opensearch.jobscheduler.spi.JobSchedulerExtension
index 627699843..01c2dfbe9 100644
--- a/src/main/resources/META-INF/services/org.opensearch.jobscheduler.spi.JobSchedulerExtension
+++ b/src/main/resources/META-INF/services/org.opensearch.jobscheduler.spi.JobSchedulerExtension
@@ -1 +1 @@
-org.opensearch.ad.AnomalyDetectorPlugin
+org.opensearch.timeseries.TimeSeriesAnalyticsPlugin
diff --git a/src/main/resources/es-plugin.properties b/src/main/resources/es-plugin.properties
index a8dc4e91e..061a6f07d 100644
--- a/src/main/resources/es-plugin.properties
+++ b/src/main/resources/es-plugin.properties
@@ -9,5 +9,5 @@
 # GitHub history for details.
# -plugin=org.opensearch.ad.AnomalyDetectorPlugin +plugin=org.opensearch.timeseries.TimeSeriesAnalyticsPlugin version=${project.version} \ No newline at end of file diff --git a/src/main/resources/mappings/checkpoint.json b/src/main/resources/mappings/anomaly-checkpoint.json similarity index 100% rename from src/main/resources/mappings/checkpoint.json rename to src/main/resources/mappings/anomaly-checkpoint.json diff --git a/src/main/resources/mappings/anomaly-detectors.json b/src/main/resources/mappings/config.json similarity index 100% rename from src/main/resources/mappings/anomaly-detectors.json rename to src/main/resources/mappings/config.json diff --git a/src/main/resources/mappings/forecast-checkpoint.json b/src/main/resources/mappings/forecast-checkpoint.json new file mode 100644 index 000000000..e3337c643 --- /dev/null +++ b/src/main/resources/mappings/forecast-checkpoint.json @@ -0,0 +1,64 @@ +{ + "dynamic": true, + "_meta": { + "schema_version": 1 + }, + "properties": { + "forecaster_id": { + "type": "keyword" + }, + "timestamp": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "schema_version": { + "type": "integer" + }, + "entity": { + "type": "nested", + "properties": { + "name": { + "type": "keyword" + }, + "value": { + "type": "keyword" + } + } + }, + "model": { + "type": "binary" + }, + "samples": { + "type": "nested", + "properties": { + "value_list": { + "type": "double" + }, + "data_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "data_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + } + } + }, + "last_processed_sample": { + "type": "nested", + "properties": { + "value_list": { + "type": "double" + }, + "data_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "data_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + } + } + } + } +} diff --git a/src/main/resources/mappings/forecast-results.json b/src/main/resources/mappings/forecast-results.json new file mode 100644 index 000000000..745d308ad --- /dev/null +++ b/src/main/resources/mappings/forecast-results.json @@ -0,0 +1,131 @@ +{ + "dynamic": true, + "_meta": { + "schema_version": 1 + }, + "properties": { + "forecaster_id": { + "type": "keyword" + }, + "feature_data": { + "type": "nested", + "properties": { + "feature_id": { + "type": "keyword" + }, + "data": { + "type": "double" + } + } + }, + "data_quality": { + "type": "double" + }, + "data_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "data_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "execution_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "execution_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "error": { + "type": "text" + }, + "user": { + "type": "nested", + "properties": { + "name": { + "type": "text", + "fields": { + "keyword": { + "type": "keyword", + "ignore_above": 256 + } + } + }, + "backend_roles": { + "type": "text", + "fields": { + "keyword": { + "type": "keyword" + } + } + }, + "roles": { + "type": "text", + "fields": { + "keyword": { + "type": "keyword" + } + } + }, + "custom_attribute_names": { + "type": "text", + "fields": { + "keyword": { + "type": "keyword" + } + } + } + } + }, + "entity": { + "type": "nested", + "properties": { + "name": { + "type": "keyword" + }, + "value": { + "type": "keyword" + } + } + }, + "schema_version": { + "type": 
"integer" + }, + "task_id": { + "type": "keyword" + }, + "model_id": { + "type": "keyword" + }, + "entity_id": { + "type": "keyword" + }, + "forecast_lower_bound": { + "type": "double" + }, + "forecast_upper_bound": { + "type": "double" + }, + "confidence_interval_width": { + "type": "double" + }, + "forecast_value": { + "type": "double" + }, + "horizon_index": { + "type": "integer" + }, + "forecast_data_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "forecast_data_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "feature_id": { + "type": "keyword" + } + } +} diff --git a/src/main/resources/mappings/forecast-state.json b/src/main/resources/mappings/forecast-state.json new file mode 100644 index 000000000..59c95a76d --- /dev/null +++ b/src/main/resources/mappings/forecast-state.json @@ -0,0 +1,133 @@ +{ + "dynamic": false, + "_meta": { + "schema_version": 1 + }, + "properties": { + "schema_version": { + "type": "integer" + }, + "last_update_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "error": { + "type": "text" + }, + "started_by": { + "type": "keyword" + }, + "stopped_by": { + "type": "keyword" + }, + "forecaster_id": { + "type": "keyword" + }, + "state": { + "type": "keyword" + }, + "task_progress": { + "type": "float" + }, + "init_progress": { + "type": "float" + }, + "current_piece": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "execution_start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "execution_end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "is_latest": { + "type": "boolean" + }, + "task_type": { + "type": "keyword" + }, + "checkpoint_id": { + "type": "keyword" + }, + "coordinating_node": { + "type": "keyword" + }, + "worker_node": { + "type": "keyword" + }, + "user": { + "type": "nested", + "properties": { + "name": { + "type": "text", + "fields": { + "keyword": { + "type": "keyword", + "ignore_above": 256 + } + } + }, + "backend_roles": { + "type" : "text", + "fields" : { + "keyword" : { + "type" : "keyword" + } + } + }, + "roles": { + "type" : "text", + "fields" : { + "keyword" : { + "type" : "keyword" + } + } + }, + "custom_attribute_names": { + "type" : "text", + "fields" : { + "keyword" : { + "type" : "keyword" + } + } + } + } + }, + "forecaster": { + FORECASTER_INDEX_MAPPING_PLACE_HOLDER + }, + "date_range": { + "properties": { + "start_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + }, + "end_time": { + "type": "date", + "format": "strict_date_time||epoch_millis" + } + } + }, + "parent_task_id": { + "type": "keyword" + }, + "entity": { + "type": "nested", + "properties": { + "name": { + "type": "keyword" + }, + "value": { + "type": "keyword" + } + } + }, + "estimated_minutes_left": { + "type": "integer" + } + } +} diff --git a/src/main/resources/mappings/anomaly-detector-jobs.json b/src/main/resources/mappings/job.json similarity index 100% rename from src/main/resources/mappings/anomaly-detector-jobs.json rename to src/main/resources/mappings/job.json diff --git a/src/test/java/org/opensearch/BwcTests.java b/src/test/java/org/opensearch/BwcTests.java deleted file mode 100644 index 692d52653..000000000 --- a/src/test/java/org/opensearch/BwcTests.java +++ /dev/null @@ -1,564 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a 
- * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch; - -import static java.util.Collections.emptyMap; -import static java.util.Collections.emptySet; -import static org.hamcrest.Matchers.equalTo; -import static org.opensearch.test.OpenSearchTestCase.randomDouble; -import static org.opensearch.test.OpenSearchTestCase.randomDoubleBetween; -import static org.opensearch.test.OpenSearchTestCase.randomIntBetween; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Collections; -import java.util.Comparator; -import java.util.HashMap; -import java.util.HashSet; -import java.util.List; -import java.util.Map; -import java.util.Set; - -import org.junit.BeforeClass; -import org.opensearch.action.FailedNodeException; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.EntityProfileName; -import org.opensearch.ad.model.ModelProfile; -import org.opensearch.ad.model.ModelProfileOnNode; -import org.opensearch.ad.transport.EntityProfileAction; -import org.opensearch.ad.transport.EntityProfileRequest; -import org.opensearch.ad.transport.EntityProfileResponse; -import org.opensearch.ad.transport.EntityResultRequest; -import org.opensearch.ad.transport.ProfileNodeResponse; -import org.opensearch.ad.transport.ProfileResponse; -import org.opensearch.ad.transport.RCFResultResponse; -import org.opensearch.ad.util.Bwc; -import org.opensearch.cluster.ClusterName; -import org.opensearch.cluster.node.DiscoveryNode; -import org.opensearch.common.io.stream.BytesStreamOutput; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.transport.TransportAddress; - -/** - * Put in core package so that we can using Version's package private constructor - * - */ -public class BwcTests extends AbstractADTest { - public static Version V_1_1_0 = new Version(1010099, org.apache.lucene.util.Version.LUCENE_8_8_2); - private EntityResultRequest entityResultRequest1_1; - private EntityResultRequest1_0 entityResultRequest1_0; - private String detectorId; - private long start, end; - private Map entities1_1, convertedEntities1_0; - private Map entities1_0; - private BytesStreamOutput output1_1, output1_0; - private String categoryField, categoryValue, categoryValue2; - private double[] feature; - private EntityProfileRequest entityProfileRequest1_1; - private EntityProfileRequest1_0 entityProfileRequest1_0; - private Entity entity, entity2, convertedEntity; - private Set profilesToCollect; - private String nodeId = "abc"; - private String modelId = "123"; - private long modelSize = 712480L; - private long modelSize2 = 112480L; - private EntityProfileResponse entityProfileResponse1_1; - private EntityProfileResponse1_0 entityProfileResponse1_0; - private ModelProfileOnNode convertedModelProfileOnNode; - private ProfileResponse profileResponse1_1; - private ProfileResponse1_0 profileResponse1_0; - private ModelProfileOnNode[] convertedModelProfileOnNodeArray; - private ModelProfile1_0[] convertedModelProfile; - private RCFResultResponse rcfResultResponse1_1; - private RCFResultResponse1_0 rcfResultResponse1_0; - - private boolean areEqualWithArrayValue(Map first, Map second) { - if (first.size() != second.size()) { - return false; - } - - return first.entrySet().stream().allMatch(e -> Arrays.equals(e.getValue(), second.get(e.getKey()))); - 
} - - private boolean areEqualEntityArrayValue1_0(Map first, Map second) { - if (first.size() != second.size()) { - return false; - } - - return first.entrySet().stream().allMatch(e -> Arrays.equals(e.getValue(), second.get(e.getKey()))); - } - - @BeforeClass - public static void setUpBeforeClass() { - Bwc.DISABLE_BWC = false; - } - - @Override - public void setUp() throws Exception { - super.setUp(); - - categoryField = "a"; - categoryValue = "b"; - categoryValue2 = "b2"; - - feature = new double[] { 0.3 }; - detectorId = "123"; - - entity = Entity.createSingleAttributeEntity(categoryField, categoryValue); - entity2 = Entity.createSingleAttributeEntity(categoryField, categoryValue2); - convertedEntity = Entity.createSingleAttributeEntity(CommonName.EMPTY_FIELD, categoryValue); - - output1_1 = new BytesStreamOutput(); - output1_1.setVersion(V_1_1_0); - - output1_0 = new BytesStreamOutput(); - output1_0.setVersion(Version.V_1_0_0); - } - - private void setUpEntityResultRequest() { - entities1_1 = new HashMap<>(); - entities1_1.put(entity, feature); - start = 10L; - end = 20L; - entityResultRequest1_1 = new EntityResultRequest(detectorId, entities1_1, start, end); - - entities1_0 = new HashMap<>(); - entities1_0.put(categoryValue, feature); - entityResultRequest1_0 = new EntityResultRequest1_0(detectorId, entities1_0, start, end); - convertedEntities1_0 = new HashMap(); - convertedEntities1_0.put(convertedEntity, feature); - } - - /** - * For EntityResultRequest, the input is a 1.1 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeSerializeEntityResultRequest1_1() throws IOException { - setUpEntityResultRequest(); - - entityResultRequest1_1.writeTo(output1_1); - - StreamInput streamInput = output1_1.bytes().streamInput(); - streamInput.setVersion(V_1_1_0); - EntityResultRequest readRequest = new EntityResultRequest(streamInput); - assertThat(readRequest.getDetectorId(), equalTo(detectorId)); - assertThat(readRequest.getStart(), equalTo(start)); - assertThat(readRequest.getEnd(), equalTo(end)); - assertTrue(areEqualWithArrayValue(readRequest.getEntities(), entities1_1)); - } - - /** - * For EntityResultRequest, the input is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeSerializeEntityResultRequest1_0() throws IOException { - setUpEntityResultRequest(); - - entityResultRequest1_0.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityResultRequest readRequest = new EntityResultRequest(streamInput); - assertThat(readRequest.getDetectorId(), equalTo(detectorId)); - assertThat(readRequest.getStart(), equalTo(start)); - assertThat(readRequest.getEnd(), equalTo(end)); - assertTrue(areEqualWithArrayValue(readRequest.getEntities(), convertedEntities1_0)); - } - - /** - * For EntityResultRequest, the output is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. 
- */ - public void testSerializateEntityResultRequest1_0() throws IOException { - setUpEntityResultRequest(); - - entityResultRequest1_1.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityResultRequest1_0 readRequest = new EntityResultRequest1_0(streamInput); - assertThat(readRequest.getDetectorId(), equalTo(detectorId)); - assertThat(readRequest.getStart(), equalTo(start)); - assertThat(readRequest.getEnd(), equalTo(end)); - assertTrue(areEqualEntityArrayValue1_0(readRequest.getEntities(), entityResultRequest1_0.getEntities())); - } - - private void setUpEntityProfileRequest() { - profilesToCollect = new HashSet(); - profilesToCollect.add(EntityProfileName.STATE); - entityProfileRequest1_1 = new EntityProfileRequest(detectorId, entity, profilesToCollect); - entityProfileRequest1_0 = new EntityProfileRequest1_0(detectorId, categoryValue, profilesToCollect); - } - - /** - * For EntityResultRequest, the input is a 1.1 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeserializeEntityProfileRequest1_1() throws IOException { - setUpEntityProfileRequest(); - - entityProfileRequest1_1.writeTo(output1_1); - - StreamInput streamInput = output1_1.bytes().streamInput(); - streamInput.setVersion(V_1_1_0); - EntityProfileRequest readRequest = new EntityProfileRequest(streamInput); - assertThat(readRequest.getAdID(), equalTo(detectorId)); - assertThat(readRequest.getEntityValue(), equalTo(entity)); - assertThat(readRequest.getProfilesToCollect(), equalTo(profilesToCollect)); - } - - /** - * For EntityResultRequest, the input is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeserializeEntityProfileRequest1_0() throws IOException { - setUpEntityProfileRequest(); - - entityProfileRequest1_0.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityProfileRequest readRequest = new EntityProfileRequest(streamInput); - assertThat(readRequest.getAdID(), equalTo(detectorId)); - assertThat(readRequest.getEntityValue(), equalTo(convertedEntity)); - assertThat(readRequest.getProfilesToCollect(), equalTo(profilesToCollect)); - } - - /** - * For EntityResultRequest, the output is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. 
- */ - public void testSerializeEntityProfileRequest1_0() throws IOException { - setUpEntityProfileRequest(); - - entityProfileRequest1_1.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityProfileRequest1_0 readRequest = new EntityProfileRequest1_0(streamInput); - assertThat(readRequest.getAdID(), equalTo(detectorId)); - assertThat(readRequest.getEntityValue(), equalTo(entity.toString())); - assertThat(readRequest.getProfilesToCollect(), equalTo(profilesToCollect)); - } - - private void setUpEntityProfileResponse() { - long lastActiveTimestamp = 10L; - EntityProfileResponse.Builder builder = new EntityProfileResponse.Builder(); - builder.setLastActiveMs(lastActiveTimestamp).build(); - ModelProfile modelProfile = new ModelProfile(modelId, entity, modelSize); - ModelProfileOnNode model = new ModelProfileOnNode(nodeId, modelProfile); - builder.setModelProfile(model); - entityProfileResponse1_1 = builder.build(); - - EntityProfileResponse1_0.Builder builder1_0 = new EntityProfileResponse1_0.Builder(); - builder1_0.setLastActiveMs(lastActiveTimestamp).build(); - ModelProfile1_0 modelProfile1_0 = new ModelProfile1_0(modelId, modelSize, nodeId); - builder1_0.setModelProfile(modelProfile1_0); - entityProfileResponse1_0 = builder1_0.build(); - ModelProfile convertedModelProfile = new ModelProfile(modelId, null, modelSize); - convertedModelProfileOnNode = new ModelProfileOnNode(CommonName.EMPTY_FIELD, convertedModelProfile); - } - - /** - * For EntityProfileResponse, the input is a 1.1 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeserializeEntityProfileResponse1_1() throws IOException { - setUpEntityProfileResponse(); - - entityProfileResponse1_1.writeTo(output1_1); - - StreamInput streamInput = output1_1.bytes().streamInput(); - streamInput.setVersion(V_1_1_0); - EntityProfileResponse readResponse = EntityProfileAction.INSTANCE.getResponseReader().read(streamInput); - assertThat(readResponse.getModelProfile(), equalTo(entityProfileResponse1_1.getModelProfile())); - assertThat(readResponse.getLastActiveMs(), equalTo(entityProfileResponse1_1.getLastActiveMs())); - assertThat(readResponse.getTotalUpdates(), equalTo(entityProfileResponse1_0.getTotalUpdates())); - } - - /** - * For EntityProfileResponse, the input is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testDeserializeEntityProfileResponse1_0() throws IOException { - setUpEntityProfileResponse(); - - entityProfileResponse1_0.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityProfileResponse readResponse = EntityProfileAction.INSTANCE.getResponseReader().read(streamInput); - assertThat(readResponse.getModelProfile(), equalTo(convertedModelProfileOnNode)); - assertThat(readResponse.getLastActiveMs(), equalTo(entityProfileResponse1_0.getLastActiveMs())); - assertThat(readResponse.getTotalUpdates(), equalTo(entityProfileResponse1_0.getTotalUpdates())); - } - - /** - * For EntityProfileResponse, the output is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. 
- */ - public void testSerializeEntityProfileResponse1_0() throws IOException { - setUpEntityProfileResponse(); - - entityProfileResponse1_1.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - EntityProfileResponse1_0 readResponse = new EntityProfileResponse1_0(streamInput); - assertThat(readResponse.getModelProfile(), equalTo(new ModelProfile1_0(modelId, modelSize, CommonName.EMPTY_FIELD))); - assertThat(readResponse.getLastActiveMs(), equalTo(entityProfileResponse1_1.getLastActiveMs())); - assertThat(readResponse.getTotalUpdates(), equalTo(entityProfileResponse1_0.getTotalUpdates())); - } - - @SuppressWarnings("serial") - private void setUpProfileResponse() { - String node1 = "node1"; - String nodeName1 = "nodename1"; - DiscoveryNode discoveryNode1_1 = new DiscoveryNode( - nodeName1, - node1, - new TransportAddress(TransportAddress.META_ADDRESS, 9300), - emptyMap(), - emptySet(), - V_1_1_0 - ); - - String node2 = "node2"; - String nodeName2 = "nodename2"; - DiscoveryNode discoveryNode2 = new DiscoveryNode( - nodeName2, - node2, - new TransportAddress(TransportAddress.META_ADDRESS, 9301), - emptyMap(), - emptySet(), - V_1_1_0 - ); - - String model1Id = "model1"; - String model2Id = "model2"; - - Map modelSizeMap1 = new HashMap() { - { - put(model1Id, modelSize); - put(model2Id, modelSize2); - } - }; - Map modelSizeMap2 = new HashMap(); - - int shingleSize = 8; - - ModelProfile modelProfile = new ModelProfile(model1Id, entity, modelSize); - ModelProfile modelProfile2 = new ModelProfile(model2Id, entity2, modelSize2); - - ProfileNodeResponse profileNodeResponse1 = new ProfileNodeResponse( - discoveryNode1_1, - modelSizeMap1, - shingleSize, - 0, - 0, - Arrays.asList(modelProfile, modelProfile2), - modelSizeMap1.size() - ); - ProfileNodeResponse profileNodeResponse2 = new ProfileNodeResponse( - discoveryNode2, - modelSizeMap2, - -1, - 0, - 0, - new ArrayList<>(), - modelSizeMap2.size() - ); - List profileNodeResponses = Arrays.asList(profileNodeResponse1, profileNodeResponse2); - List failures = Collections.emptyList(); - - ClusterName clusterName = new ClusterName("test-cluster-name"); - profileResponse1_1 = new ProfileResponse(clusterName, profileNodeResponses, failures); - - ProfileNodeResponse1_0 profileNodeResponse1_1_0 = new ProfileNodeResponse1_0(discoveryNode1_1, modelSizeMap1, shingleSize, 0, 0); - ProfileNodeResponse1_0 profileNodeResponse2_1_0 = new ProfileNodeResponse1_0(discoveryNode2, modelSizeMap2, -1, 0, 0); - List profileNodeResponses1_0 = Arrays.asList(profileNodeResponse1_1_0, profileNodeResponse2_1_0); - profileResponse1_0 = new ProfileResponse1_0(clusterName, profileNodeResponses1_0, failures); - - convertedModelProfileOnNodeArray = new ModelProfileOnNode[2]; - ModelProfile convertedModelProfile1 = new ModelProfile(model1Id, null, modelSize); - convertedModelProfileOnNodeArray[0] = new ModelProfileOnNode(CommonName.EMPTY_FIELD, convertedModelProfile1); - - ModelProfile convertedModelProfile2 = new ModelProfile(model2Id, null, modelSize2); - convertedModelProfileOnNodeArray[1] = new ModelProfileOnNode(CommonName.EMPTY_FIELD, convertedModelProfile2); - - convertedModelProfile = new ModelProfile1_0[2]; - convertedModelProfile[0] = new ModelProfile1_0(model1Id, modelSize, CommonName.EMPTY_FIELD); - convertedModelProfile[1] = new ModelProfile1_0(model2Id, modelSize2, CommonName.EMPTY_FIELD); - } - - /** - * For ProfileResponse, the input is a 1.1 stream. 
-     * @throws IOException when serialization/deserialization has issues.
-     */
-    public void testDeserializeProfileResponse1_1() throws IOException {
-        setUpProfileResponse();
-
-        profileResponse1_1.writeTo(output1_1);
-
-        StreamInput streamInput = output1_1.bytes().streamInput();
-        streamInput.setVersion(V_1_1_0);
-        ProfileResponse readResponse = new ProfileResponse(streamInput);
-        assertThat(readResponse.getModelProfile(), equalTo(profileResponse1_1.getModelProfile()));
-        assertThat(readResponse.getShingleSize(), equalTo(profileResponse1_1.getShingleSize()));
-        assertThat(readResponse.getActiveEntities(), equalTo(profileResponse1_1.getActiveEntities()));
-        assertThat(readResponse.getTotalUpdates(), equalTo(profileResponse1_1.getTotalUpdates()));
-        assertThat(readResponse.getCoordinatingNode(), equalTo(profileResponse1_1.getCoordinatingNode()));
-        assertThat(readResponse.getTotalSizeInBytes(), equalTo(profileResponse1_1.getTotalSizeInBytes()));
-        assertThat(readResponse.getModelCount(), equalTo(profileResponse1_1.getModelCount()));
-    }
-
-    /**
-     * For ProfileResponse, the input is a 1.0 stream.
-     * @throws IOException when serialization/deserialization has issues.
-     */
-    public void testDeserializeProfileResponse1_0() throws IOException {
-        setUpProfileResponse();
-
-        profileResponse1_0.writeTo(output1_0);
-
-        StreamInput streamInput = output1_0.bytes().streamInput();
-        streamInput.setVersion(Version.V_1_0_0);
-        ProfileResponse readResponse = new ProfileResponse(streamInput);
-        ModelProfileOnNode[] actualModelProfileOnNode = readResponse.getModelProfile();
-
-        // Since ProfileResponse1_0's constructor iterates modelSize, which is a HashMap,
-        // the iteration order is not deterministic. We have to sort the results before
-        // comparing them with the expected value.
-        Arrays.sort(actualModelProfileOnNode, new Comparator<ModelProfileOnNode>() {
-            @Override
-            public int compare(ModelProfileOnNode o1, ModelProfileOnNode o2) {
-                return o1.getModelId().compareTo(o2.getModelId());
-            }
-
-        });
-        assertThat(actualModelProfileOnNode, equalTo(convertedModelProfileOnNodeArray));
-        assertThat(readResponse.getShingleSize(), equalTo(profileResponse1_1.getShingleSize()));
-        assertThat(readResponse.getActiveEntities(), equalTo(profileResponse1_1.getActiveEntities()));
-        assertThat(readResponse.getTotalUpdates(), equalTo(profileResponse1_1.getTotalUpdates()));
-        assertThat(readResponse.getCoordinatingNode(), equalTo(profileResponse1_1.getCoordinatingNode()));
-        assertThat(readResponse.getTotalSizeInBytes(), equalTo(profileResponse1_1.getTotalSizeInBytes()));
-        assertThat(readResponse.getModelCount(), equalTo(0L));
-    }
-
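The 1.0-stream tests above and below all depend on one convention: whichever side is pinned to an older Version must fall back to the old wire shape, dropping or down-converting fields the old reader cannot parse (the structured Entity becomes entity.toString(), ProfileResponse loses its model count, and so on). A hedged sketch of the version-gated writeTo branch that yields such behavior; the field names and version constant are illustrative, not the production classes' actual code:

    @Override
    public void writeTo(StreamOutput out) throws IOException {
        super.writeTo(out);
        out.writeString(adID);
        if (out.getVersion().onOrAfter(Version.V_1_1_0)) {
            entity.writeTo(out);                 // 1.1+ peers understand the structured Entity
        } else {
            out.writeString(entity.toString());  // a 1.0 peer only accepts a plain string value
        }
    }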
-    /**
-     * For ProfileResponse, the output is a 1.0 stream.
-     * @throws IOException when serialization/deserialization has issues.
-     */
-    public void testSerializeProfileResponse1_0() throws IOException {
-        setUpProfileResponse();
-
-        profileResponse1_1.writeTo(output1_0);
-
-        StreamInput streamInput = output1_0.bytes().streamInput();
-        streamInput.setVersion(Version.V_1_0_0);
-        ProfileResponse1_0 readResponse = new ProfileResponse1_0(streamInput);
-        assertThat(readResponse.getModelProfile(), equalTo(convertedModelProfile));
-        assertThat(readResponse.getShingleSize(), equalTo(profileResponse1_1.getShingleSize()));
-        assertThat(readResponse.getActiveEntities(), equalTo(profileResponse1_1.getActiveEntities()));
-        assertThat(readResponse.getTotalUpdates(), equalTo(profileResponse1_1.getTotalUpdates()));
-        assertThat(readResponse.getCoordinatingNode(), equalTo(profileResponse1_1.getCoordinatingNode()));
-        assertThat(readResponse.getTotalSizeInBytes(), equalTo(profileResponse1_1.getTotalSizeInBytes()));
-    }
-
-    /**
-     * JaCoCo reported line coverage of 0.5 even though the class's only one-line
-     * method is covered; the line it flags as uncovered is the class itself
-     * (its implicit constructor).
-     * Use the solution mentioned in https://tinyurl.com/2pttzsd3
-     */
-    @SuppressWarnings("static-access")
-    public void testBwcInstance() {
-        Bwc bwc = new Bwc();
-        assertNotNull(bwc);
-    }
-
-    private void setUpRCFResultResponse() {
-        rcfResultResponse1_1 = new RCFResultResponse(
-            0.345,
-            0.123,
-            30,
-            new double[] { 0.3, 0.7 },
-            134,
-            0.4,
-            Version.CURRENT,
-            randomIntBetween(-3, 0),
-            new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) },
-            new double[][] { new double[] { randomDouble(), randomDouble() } },
-            new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) },
-            randomDoubleBetween(1.1, 10.0, true)
-        );
-        rcfResultResponse1_0 = new RCFResultResponse1_0(0.345, 0.123, 30, new double[] { 0.3, 0.7 });
-    }
-
-    /**
-     * For RCFResultResponse, the input is a 1.1 stream.
-     * @throws IOException when serialization/deserialization has issues.
-     */
-    public void testDeserializeRCFResultResponse1_1() throws IOException {
-        setUpRCFResultResponse();
-
-        rcfResultResponse1_1.writeTo(output1_1);
-
-        StreamInput streamInput = output1_1.bytes().streamInput();
-        streamInput.setVersion(V_1_1_0);
-        RCFResultResponse readResponse = new RCFResultResponse(streamInput);
-        assertArrayEquals(readResponse.getAttribution(), rcfResultResponse1_1.getAttribution(), 0.001);
-        assertThat(readResponse.getConfidence(), equalTo(rcfResultResponse1_1.getConfidence()));
-        assertThat(readResponse.getForestSize(), equalTo(rcfResultResponse1_1.getForestSize()));
-        assertThat(readResponse.getTotalUpdates(), equalTo(rcfResultResponse1_1.getTotalUpdates()));
-        assertThat(readResponse.getRCFScore(), equalTo(rcfResultResponse1_1.getRCFScore()));
-    }
-
-    /**
-     * For RCFResultResponse, the input is a 1.0 stream.
-     * @throws IOException when serialization/deserialization has issues.
- */ - public void testDeserializeRCFResultResponse1_0() throws IOException { - setUpRCFResultResponse(); - - rcfResultResponse1_0.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - RCFResultResponse readResponse = new RCFResultResponse(streamInput); - assertArrayEquals(readResponse.getAttribution(), rcfResultResponse1_0.getAttribution(), 0.001); - assertThat(readResponse.getConfidence(), equalTo(rcfResultResponse1_0.getConfidence())); - assertThat(readResponse.getForestSize(), equalTo(rcfResultResponse1_0.getForestSize())); - assertThat(readResponse.getTotalUpdates(), equalTo(0L)); - assertThat(readResponse.getRCFScore(), equalTo(rcfResultResponse1_0.getRCFScore())); - } - - /** - * For RCFResultResponse, the output is a 1.0 stream. - * @throws IOException when serialization/deserialization has issues. - */ - public void testSerializeRCFResultResponse1_0() throws IOException { - setUpRCFResultResponse(); - - rcfResultResponse1_1.writeTo(output1_0); - - StreamInput streamInput = output1_0.bytes().streamInput(); - streamInput.setVersion(Version.V_1_0_0); - RCFResultResponse1_0 readResponse = new RCFResultResponse1_0(streamInput); - assertArrayEquals(readResponse.getAttribution(), rcfResultResponse1_0.getAttribution(), 0.001); - assertThat(readResponse.getConfidence(), equalTo(rcfResultResponse1_0.getConfidence())); - assertThat(readResponse.getForestSize(), equalTo(rcfResultResponse1_0.getForestSize())); - assertThat(readResponse.getRCFScore(), equalTo(rcfResultResponse1_0.getRCFScore())); - } -} diff --git a/src/test/java/org/opensearch/EntityProfileRequest1_0.java b/src/test/java/org/opensearch/EntityProfileRequest1_0.java deleted file mode 100644 index c03dd1158..000000000 --- a/src/test/java/org/opensearch/EntityProfileRequest1_0.java +++ /dev/null @@ -1,105 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. 
- */ - -package org.opensearch; - -import static org.opensearch.action.ValidateActions.addValidationError; - -import java.io.IOException; -import java.util.HashSet; -import java.util.Set; - -import org.opensearch.action.ActionRequest; -import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.EntityProfileName; -import org.opensearch.core.common.Strings; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentObject; -import org.opensearch.core.xcontent.XContentBuilder; - -public class EntityProfileRequest1_0 extends ActionRequest implements ToXContentObject { - public static final String ENTITY = "entity"; - public static final String PROFILES = "profiles"; - private String adID; - private String entityValue; - private Set profilesToCollect; - - public EntityProfileRequest1_0(StreamInput in) throws IOException { - super(in); - adID = in.readString(); - entityValue = in.readString(); - int size = in.readVInt(); - profilesToCollect = new HashSet(); - if (size != 0) { - for (int i = 0; i < size; i++) { - profilesToCollect.add(in.readEnum(EntityProfileName.class)); - } - } - } - - public EntityProfileRequest1_0(String adID, String entityValue, Set profilesToCollect) { - super(); - this.adID = adID; - this.entityValue = entityValue; - this.profilesToCollect = profilesToCollect; - } - - public String getAdID() { - return adID; - } - - public String getEntityValue() { - return entityValue; - } - - public Set getProfilesToCollect() { - return profilesToCollect; - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - super.writeTo(out); - out.writeString(adID); - out.writeString(entityValue); - out.writeVInt(profilesToCollect.size()); - for (EntityProfileName profile : profilesToCollect) { - out.writeEnum(profile); - } - } - - @Override - public ActionRequestValidationException validate() { - ActionRequestValidationException validationException = null; - if (Strings.isEmpty(adID)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); - } - if (Strings.isEmpty(entityValue)) { - validationException = addValidationError("Entity value is missing", validationException); - } - if (profilesToCollect == null || profilesToCollect.isEmpty()) { - validationException = addValidationError(CommonErrorMessages.EMPTY_PROFILES_COLLECT, validationException); - } - return validationException; - } - - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, adID); - builder.field(ENTITY, entityValue); - builder.field(PROFILES, profilesToCollect); - builder.endObject(); - return builder; - } -} diff --git a/src/test/java/org/opensearch/EntityProfileResponse1_0.java b/src/test/java/org/opensearch/EntityProfileResponse1_0.java deleted file mode 100644 index 162d8b4d1..000000000 --- a/src/test/java/org/opensearch/EntityProfileResponse1_0.java +++ /dev/null @@ -1,172 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. 
- */ - -package org.opensearch; - -import java.io.IOException; -import java.util.Optional; - -import org.apache.commons.lang.builder.EqualsBuilder; -import org.apache.commons.lang.builder.HashCodeBuilder; -import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.core.action.ActionResponse; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentObject; -import org.opensearch.core.xcontent.XContentBuilder; - -public class EntityProfileResponse1_0 extends ActionResponse implements ToXContentObject { - public static final String ACTIVE = "active"; - public static final String LAST_ACTIVE_TS = "last_active_timestamp"; - public static final String TOTAL_UPDATES = "total_updates"; - private final Boolean isActive; - private final long lastActiveMs; - private final long totalUpdates; - private final ModelProfile1_0 modelProfile; - - public static class Builder { - private Boolean isActive = null; - private long lastActiveMs = -1L; - private long totalUpdates = -1L; - private ModelProfile1_0 modelProfile = null; - - public Builder() {} - - public Builder setActive(Boolean isActive) { - this.isActive = isActive; - return this; - } - - public Builder setLastActiveMs(long lastActiveMs) { - this.lastActiveMs = lastActiveMs; - return this; - } - - public Builder setTotalUpdates(long totalUpdates) { - this.totalUpdates = totalUpdates; - return this; - } - - public Builder setModelProfile(ModelProfile1_0 modelProfile) { - this.modelProfile = modelProfile; - return this; - } - - public EntityProfileResponse1_0 build() { - return new EntityProfileResponse1_0(isActive, lastActiveMs, totalUpdates, modelProfile); - } - } - - public EntityProfileResponse1_0(Boolean isActive, long lastActiveTimeMs, long totalUpdates, ModelProfile1_0 modelProfile) { - this.isActive = isActive; - this.lastActiveMs = lastActiveTimeMs; - this.totalUpdates = totalUpdates; - this.modelProfile = modelProfile; - } - - public EntityProfileResponse1_0(StreamInput in) throws IOException { - super(in); - isActive = in.readOptionalBoolean(); - lastActiveMs = in.readLong(); - totalUpdates = in.readLong(); - if (in.readBoolean()) { - modelProfile = new ModelProfile1_0(in); - } else { - modelProfile = null; - } - } - - public Optional isActive() { - return Optional.ofNullable(isActive); - } - - public long getLastActiveMs() { - return lastActiveMs; - } - - public long getTotalUpdates() { - return totalUpdates; - } - - public ModelProfile1_0 getModelProfile() { - return modelProfile; - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - out.writeOptionalBoolean(isActive); - out.writeLong(lastActiveMs); - out.writeLong(totalUpdates); - if (modelProfile != null) { - out.writeBoolean(true); - modelProfile.writeTo(out); - } else { - out.writeBoolean(false); - } - } - - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.startObject(); - if (isActive != null) { - builder.field(ACTIVE, isActive); - } - if (lastActiveMs >= 0) { - builder.field(LAST_ACTIVE_TS, lastActiveMs); - } - if (totalUpdates >= 0) { - builder.field(TOTAL_UPDATES, totalUpdates); - } - if (modelProfile != null) { - builder.field(CommonName.MODEL, modelProfile); - } - builder.endObject(); - return builder; - } - - @Override - public String toString() { - ToStringBuilder builder = new ToStringBuilder(this); - 
builder.append(ACTIVE, isActive); - builder.append(LAST_ACTIVE_TS, lastActiveMs); - builder.append(TOTAL_UPDATES, totalUpdates); - builder.append(CommonName.MODEL, modelProfile); - - return builder.toString(); - } - - @Override - public boolean equals(Object obj) { - if (this == obj) - return true; - if (obj == null) - return false; - if (getClass() != obj.getClass()) - return false; - if (obj instanceof EntityProfileResponse1_0) { - EntityProfileResponse1_0 other = (EntityProfileResponse1_0) obj; - EqualsBuilder equalsBuilder = new EqualsBuilder(); - equalsBuilder.append(isActive, other.isActive); - equalsBuilder.append(lastActiveMs, other.lastActiveMs); - equalsBuilder.append(totalUpdates, other.totalUpdates); - equalsBuilder.append(modelProfile, other.modelProfile); - - return equalsBuilder.isEquals(); - } - return false; - } - - @Override - public int hashCode() { - return new HashCodeBuilder().append(isActive).append(lastActiveMs).append(totalUpdates).append(modelProfile).toHashCode(); - } -} diff --git a/src/test/java/org/opensearch/EntityResultRequest1_0.java b/src/test/java/org/opensearch/EntityResultRequest1_0.java deleted file mode 100644 index 30c692d9d..000000000 --- a/src/test/java/org/opensearch/EntityResultRequest1_0.java +++ /dev/null @@ -1,105 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch; - -import static org.opensearch.action.ValidateActions.addValidationError; - -import java.io.IOException; -import java.util.Locale; -import java.util.Map; - -import org.opensearch.action.ActionRequest; -import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.core.common.Strings; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentObject; -import org.opensearch.core.xcontent.XContentBuilder; - -public class EntityResultRequest1_0 extends ActionRequest implements ToXContentObject { - - private String detectorId; - private Map entities; - private long start; - private long end; - - public EntityResultRequest1_0(StreamInput in) throws IOException { - super(in); - this.detectorId = in.readString(); - this.entities = in.readMap(StreamInput::readString, StreamInput::readDoubleArray); - this.start = in.readLong(); - this.end = in.readLong(); - } - - public EntityResultRequest1_0(String detectorId, Map entities, long start, long end) { - super(); - this.detectorId = detectorId; - this.entities = entities; - this.start = start; - this.end = end; - } - - public String getDetectorId() { - return this.detectorId; - } - - public Map getEntities() { - return this.entities; - } - - public long getStart() { - return this.start; - } - - public long getEnd() { - return this.end; - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - super.writeTo(out); - out.writeString(this.detectorId); - out.writeMap(this.entities, StreamOutput::writeString, StreamOutput::writeDoubleArray); - out.writeLong(this.start); - out.writeLong(this.end); - } - - @Override - public ActionRequestValidationException validate() { - ActionRequestValidationException 
validationException = null; - if (Strings.isEmpty(detectorId)) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); - } - if (start <= 0 || end <= 0 || start > end) { - validationException = addValidationError( - String.format(Locale.ROOT, "%s: start %d, end %d", CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG, start, end), - validationException - ); - } - return validationException; - } - - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.startObject(); - builder.field(CommonName.ID_JSON_KEY, detectorId); - builder.field(CommonName.START_JSON_KEY, start); - builder.field(CommonName.END_JSON_KEY, end); - for (String entity : entities.keySet()) { - builder.field(entity, entities.get(entity)); - } - builder.endObject(); - return builder; - } -} diff --git a/src/test/java/org/opensearch/ModelProfile1_0.java b/src/test/java/org/opensearch/ModelProfile1_0.java deleted file mode 100644 index baebd9b99..000000000 --- a/src/test/java/org/opensearch/ModelProfile1_0.java +++ /dev/null @@ -1,114 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch; - -import java.io.IOException; - -import org.apache.commons.lang.builder.EqualsBuilder; -import org.apache.commons.lang.builder.HashCodeBuilder; -import org.apache.commons.lang.builder.ToStringBuilder; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.common.io.stream.Writeable; -import org.opensearch.core.xcontent.ToXContent; -import org.opensearch.core.xcontent.XContentBuilder; - -public class ModelProfile1_0 implements Writeable, ToXContent { - // field name in toXContent - public static final String MODEL_ID = "model_id"; - public static final String MODEL_SIZE_IN_BYTES = "model_size_in_bytes"; - public static final String NODE_ID = "node_id"; - - private final String modelId; - private final long modelSizeInBytes; - private final String nodeId; - - public ModelProfile1_0(String modelId, long modelSize, String nodeId) { - super(); - this.modelId = modelId; - this.modelSizeInBytes = modelSize; - this.nodeId = nodeId; - } - - public ModelProfile1_0(StreamInput in) throws IOException { - modelId = in.readString(); - modelSizeInBytes = in.readLong(); - nodeId = in.readString(); - } - - public String getModelId() { - return modelId; - } - - public long getModelSize() { - return modelSizeInBytes; - } - - public String getNodeId() { - return nodeId; - } - - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.startObject(); - builder.field(MODEL_ID, modelId); - if (modelSizeInBytes > 0) { - builder.field(MODEL_SIZE_IN_BYTES, modelSizeInBytes); - } - builder.field(NODE_ID, nodeId); - builder.endObject(); - return builder; - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - out.writeString(modelId); - out.writeLong(modelSizeInBytes); - out.writeString(nodeId); - } - - @Override - public boolean equals(Object obj) { - if (this == obj) - return true; - if (obj == null) - return false; - if (getClass() != obj.getClass()) - return false; - if (obj instanceof ModelProfile1_0) { - 
ModelProfile1_0 other = (ModelProfile1_0) obj; - EqualsBuilder equalsBuilder = new EqualsBuilder(); - equalsBuilder.append(modelId, other.modelId); - equalsBuilder.append(modelSizeInBytes, other.modelSizeInBytes); - equalsBuilder.append(nodeId, other.nodeId); - - return equalsBuilder.isEquals(); - } - return false; - } - - @Override - public int hashCode() { - return new HashCodeBuilder().append(modelId).append(modelSizeInBytes).append(nodeId).toHashCode(); - } - - @Override - public String toString() { - ToStringBuilder builder = new ToStringBuilder(this); - builder.append(MODEL_ID, modelId); - if (modelSizeInBytes > 0) { - builder.append(MODEL_SIZE_IN_BYTES, modelSizeInBytes); - } - builder.append(NODE_ID, nodeId); - return builder.toString(); - } -} diff --git a/src/test/java/org/opensearch/ProfileNodeResponse1_0.java b/src/test/java/org/opensearch/ProfileNodeResponse1_0.java deleted file mode 100644 index c399b6b35..000000000 --- a/src/test/java/org/opensearch/ProfileNodeResponse1_0.java +++ /dev/null @@ -1,134 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch; - -import java.io.IOException; -import java.util.Map; - -import org.opensearch.action.support.nodes.BaseNodeResponse; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.cluster.node.DiscoveryNode; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentFragment; -import org.opensearch.core.xcontent.XContentBuilder; - -/** - * Profile response on a node - */ -public class ProfileNodeResponse1_0 extends BaseNodeResponse implements ToXContentFragment { - // filed name in toXContent - static final String MODEL_SIZE_IN_BYTES = "model_size_in_bytes"; - - private Map modelSize; - private int shingleSize; - private long activeEntities; - private long totalUpdates; - - /** - * Constructor - * - * @param in StreamInput - * @throws IOException throws an IO exception if the StreamInput cannot be read from - */ - public ProfileNodeResponse1_0(StreamInput in) throws IOException { - super(in); - if (in.readBoolean()) { - modelSize = in.readMap(StreamInput::readString, StreamInput::readLong); - } - shingleSize = in.readInt(); - activeEntities = in.readVLong(); - totalUpdates = in.readVLong(); - } - - /** - * Constructor - * - * @param node DiscoveryNode object - * @param modelSize Mapping of model id to its memory consumption in bytes - * @param shingleSize shingle size - * @param activeEntity active entity count - * @param totalUpdates RCF model total updates - */ - public ProfileNodeResponse1_0(DiscoveryNode node, Map modelSize, int shingleSize, long activeEntity, long totalUpdates) { - super(node); - this.modelSize = modelSize; - this.shingleSize = shingleSize; - this.activeEntities = activeEntity; - this.totalUpdates = totalUpdates; - } - - /** - * Creates a new ProfileNodeResponse object and reads in the profile from an input stream - * - * @param in StreamInput to read from - * @return ProfileNodeResponse object corresponding to the input stream - * @throws IOException throws an IO exception if the StreamInput cannot be read from - */ - public static ProfileNodeResponse1_0 readProfiles(StreamInput in) throws IOException { - return 
new ProfileNodeResponse1_0(in); - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - super.writeTo(out); - if (modelSize != null) { - out.writeBoolean(true); - out.writeMap(modelSize, StreamOutput::writeString, StreamOutput::writeLong); - } else { - out.writeBoolean(false); - } - - out.writeInt(shingleSize); - out.writeVLong(activeEntities); - out.writeVLong(totalUpdates); - } - - /** - * Converts profile to xContent - * - * @param builder XContentBuilder - * @param params Params - * @return XContentBuilder - * @throws IOException thrown by builder for invalid field - */ - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.startObject(MODEL_SIZE_IN_BYTES); - for (Map.Entry entry : modelSize.entrySet()) { - builder.field(entry.getKey(), entry.getValue()); - } - builder.endObject(); - - builder.field(CommonName.SHINGLE_SIZE, shingleSize); - builder.field(CommonName.ACTIVE_ENTITIES, activeEntities); - builder.field(CommonName.TOTAL_UPDATES, totalUpdates); - - return builder; - } - - public Map getModelSize() { - return modelSize; - } - - public int getShingleSize() { - return shingleSize; - } - - public long getActiveEntities() { - return activeEntities; - } - - public long getTotalUpdates() { - return totalUpdates; - } -} diff --git a/src/test/java/org/opensearch/ProfileResponse1_0.java b/src/test/java/org/opensearch/ProfileResponse1_0.java deleted file mode 100644 index 56f13e2f9..000000000 --- a/src/test/java/org/opensearch/ProfileResponse1_0.java +++ /dev/null @@ -1,169 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. 
- */ - -package org.opensearch; - -import java.io.IOException; -import java.util.ArrayList; -import java.util.List; -import java.util.Map; - -import org.opensearch.action.FailedNodeException; -import org.opensearch.action.support.nodes.BaseNodesResponse; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.cluster.ClusterName; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentFragment; -import org.opensearch.core.xcontent.XContentBuilder; - -/** - * This class consists of the aggregated responses from the nodes - */ -public class ProfileResponse1_0 extends BaseNodesResponse implements ToXContentFragment { - // filed name in toXContent - static final String COORDINATING_NODE = CommonName.COORDINATING_NODE; - static final String SHINGLE_SIZE = CommonName.SHINGLE_SIZE; - static final String TOTAL_SIZE = CommonName.TOTAL_SIZE_IN_BYTES; - static final String ACTIVE_ENTITY = CommonName.ACTIVE_ENTITIES; - static final String MODELS = CommonName.MODELS; - static final String TOTAL_UPDATES = CommonName.TOTAL_UPDATES; - - private ModelProfile1_0[] modelProfile; - private int shingleSize; - private String coordinatingNode; - private long totalSizeInBytes; - private long activeEntities; - private long totalUpdates; - - /** - * Constructor - * - * @param in StreamInput - * @throws IOException thrown when unable to read from stream - */ - public ProfileResponse1_0(StreamInput in) throws IOException { - super(in); - int size = in.readVInt(); - modelProfile = new ModelProfile1_0[size]; - for (int i = 0; i < size; i++) { - modelProfile[i] = new ModelProfile1_0(in); - } - shingleSize = in.readInt(); - coordinatingNode = in.readString(); - totalSizeInBytes = in.readVLong(); - activeEntities = in.readVLong(); - totalUpdates = in.readVLong(); - } - - /** - * Constructor - * - * @param clusterName name of cluster - * @param nodes List of ProfileNodeResponse from nodes - * @param failures List of failures from nodes - */ - public ProfileResponse1_0(ClusterName clusterName, List nodes, List failures) { - super(clusterName, nodes, failures); - totalSizeInBytes = 0L; - activeEntities = 0L; - totalUpdates = 0L; - shingleSize = -1; - List modelProfileList = new ArrayList<>(); - for (ProfileNodeResponse1_0 response : nodes) { - String curNodeId = response.getNode().getId(); - if (response.getShingleSize() >= 0) { - coordinatingNode = curNodeId; - shingleSize = response.getShingleSize(); - } - if (response.getModelSize() != null) { - for (Map.Entry entry : response.getModelSize().entrySet()) { - totalSizeInBytes += entry.getValue(); - modelProfileList.add(new ModelProfile1_0(entry.getKey(), entry.getValue(), curNodeId)); - } - } - - if (response.getActiveEntities() > 0) { - activeEntities += response.getActiveEntities(); - } - if (response.getTotalUpdates() > totalUpdates) { - totalUpdates = response.getTotalUpdates(); - } - } - if (coordinatingNode == null) { - coordinatingNode = ""; - } - this.modelProfile = modelProfileList.toArray(new ModelProfile1_0[0]); - } - - @Override - public void writeTo(StreamOutput out) throws IOException { - super.writeTo(out); - out.writeVInt(modelProfile.length); - for (ModelProfile1_0 profile : modelProfile) { - profile.writeTo(out); - } - out.writeInt(shingleSize); - out.writeString(coordinatingNode); - out.writeVLong(totalSizeInBytes); - out.writeVLong(activeEntities); - out.writeVLong(totalUpdates); - } - - @Override - public void 
writeNodesTo(StreamOutput out, List nodes) throws IOException { - out.writeList(nodes); - } - - @Override - public List readNodesFrom(StreamInput in) throws IOException { - return in.readList(ProfileNodeResponse1_0::readProfiles); - } - - @Override - public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - builder.field(COORDINATING_NODE, coordinatingNode); - builder.field(SHINGLE_SIZE, shingleSize); - builder.field(TOTAL_SIZE, totalSizeInBytes); - builder.field(ACTIVE_ENTITY, activeEntities); - builder.field(TOTAL_UPDATES, totalUpdates); - builder.startArray(MODELS); - for (ModelProfile1_0 profile : modelProfile) { - profile.toXContent(builder, params); - } - builder.endArray(); - return builder; - } - - public ModelProfile1_0[] getModelProfile() { - return modelProfile; - } - - public int getShingleSize() { - return shingleSize; - } - - public long getActiveEntities() { - return activeEntities; - } - - public long getTotalUpdates() { - return totalUpdates; - } - - public String getCoordinatingNode() { - return coordinatingNode; - } - - public long getTotalSizeInBytes() { - return totalSizeInBytes; - } -} diff --git a/src/test/java/org/opensearch/RCFResultResponse1_0.java b/src/test/java/org/opensearch/RCFResultResponse1_0.java deleted file mode 100644 index 9e1ecafe7..000000000 --- a/src/test/java/org/opensearch/RCFResultResponse1_0.java +++ /dev/null @@ -1,87 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch; - -import java.io.IOException; - -import org.opensearch.core.action.ActionResponse; -import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.common.io.stream.StreamOutput; -import org.opensearch.core.xcontent.ToXContentObject; -import org.opensearch.core.xcontent.XContentBuilder; - -public class RCFResultResponse1_0 extends ActionResponse implements ToXContentObject { - public static final String RCF_SCORE_JSON_KEY = "rcfScore"; - public static final String CONFIDENCE_JSON_KEY = "confidence"; - public static final String FOREST_SIZE_JSON_KEY = "forestSize"; - public static final String ATTRIBUTION_JSON_KEY = "attribution"; - private double rcfScore; - private double confidence; - private int forestSize; - private double[] attribution; - - public RCFResultResponse1_0(double rcfScore, double confidence, int forestSize, double[] attribution) { - this.rcfScore = rcfScore; - this.confidence = confidence; - this.forestSize = forestSize; - this.attribution = attribution; - } - - public RCFResultResponse1_0(StreamInput in) throws IOException { - super(in); - rcfScore = in.readDouble(); - confidence = in.readDouble(); - forestSize = in.readVInt(); - attribution = in.readDoubleArray(); - } - - public double getRCFScore() { - return rcfScore; - } - - public double getConfidence() { - return confidence; - } - - public int getForestSize() { - return forestSize; - } - - /** - * Returns RCF score attribution. - * - * @return RCF score attribution. 
- */
-    public double[] getAttribution() {
-        return attribution;
-    }
-
-    @Override
-    public void writeTo(StreamOutput out) throws IOException {
-        out.writeDouble(rcfScore);
-        out.writeDouble(confidence);
-        out.writeVInt(forestSize);
-        out.writeDoubleArray(attribution);
-    }
-
-    @Override
-    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
-        builder.startObject();
-        builder.field(RCF_SCORE_JSON_KEY, rcfScore);
-        builder.field(CONFIDENCE_JSON_KEY, confidence);
-        builder.field(FOREST_SIZE_JSON_KEY, forestSize);
-        builder.field(ATTRIBUTION_JSON_KEY, attribution);
-        builder.endObject();
-        return builder;
-    }
-
-}
diff --git a/src/test/java/org/opensearch/StreamInputOutputTests.java b/src/test/java/org/opensearch/StreamInputOutputTests.java
new file mode 100644
index 000000000..82ff5cc24
--- /dev/null
+++ b/src/test/java/org/opensearch/StreamInputOutputTests.java
@@ -0,0 +1,293 @@
+/*
+ * SPDX-License-Identifier: Apache-2.0
+ *
+ * The OpenSearch Contributors require contributions made to
+ * this file be licensed under the Apache-2.0 license or a
+ * compatible open source license.
+ *
+ * Modifications Copyright OpenSearch Contributors. See
+ * GitHub history for details.
+ */
+
+package org.opensearch;
+
+import static java.util.Collections.emptyMap;
+import static java.util.Collections.emptySet;
+import static org.hamcrest.Matchers.equalTo;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+import org.opensearch.action.FailedNodeException;
+import org.opensearch.ad.model.EntityProfileName;
+import org.opensearch.ad.model.ModelProfile;
+import org.opensearch.ad.model.ModelProfileOnNode;
+import org.opensearch.ad.transport.EntityProfileAction;
+import org.opensearch.ad.transport.EntityProfileRequest;
+import org.opensearch.ad.transport.EntityProfileResponse;
+import org.opensearch.ad.transport.EntityResultRequest;
+import org.opensearch.ad.transport.ProfileNodeResponse;
+import org.opensearch.ad.transport.ProfileResponse;
+import org.opensearch.ad.transport.RCFResultResponse;
+import org.opensearch.cluster.ClusterName;
+import org.opensearch.cluster.node.DiscoveryNode;
+import org.opensearch.common.io.stream.BytesStreamOutput;
+import org.opensearch.core.common.io.stream.StreamInput;
+import org.opensearch.core.common.transport.TransportAddress;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.model.Entity;
+
+/**
+ * Put in core package so that we can use Version's package private constructor
+ *
+ */
+public class StreamInputOutputTests extends AbstractTimeSeriesTest {
+    // public static Version V_1_1_0 = new Version(1010099, org.apache.lucene.util.Version.LUCENE_8_8_2);
+    private EntityResultRequest entityResultRequest;
+    private String detectorId;
+    private long start, end;
+    private Map<Entity, double[]> entities;
+    private BytesStreamOutput output;
+    private String categoryField, categoryValue, categoryValue2;
+    private double[] feature;
+    private EntityProfileRequest entityProfileRequest;
+    private Entity entity, entity2;
+    private Set<EntityProfileName> profilesToCollect;
+    private String nodeId = "abc";
+    private String modelId = "123";
+    private long modelSize = 712480L;
+    private long modelSize2 = 112480L;
+    private EntityProfileResponse entityProfileResponse;
+    private ProfileResponse profileResponse;
+    private RCFResultResponse rcfResultResponse;
+
+    private boolean areEqualWithArrayValue(Map<Entity, double[]> first, Map<Entity, double[]> second) {
+        if (first.size() != second.size()) {
+            return false;
+        }
+
+        return first.entrySet().stream().allMatch(e -> Arrays.equals(e.getValue(), second.get(e.getKey())));
+    }
+
+    @Override
+    public void setUp() throws Exception {
+        super.setUp();
+
+        categoryField = "a";
+        categoryValue = "b";
+        categoryValue2 = "b2";
+
+        feature = new double[] { 0.3 };
+        detectorId = "123";
+
+        entity = Entity.createSingleAttributeEntity(categoryField, categoryValue);
+        entity2 = Entity.createSingleAttributeEntity(categoryField, categoryValue2);
+
+        output = new BytesStreamOutput();
+    }
+
+    private void setUpEntityResultRequest() {
+        entities = new HashMap<>();
+        entities.put(entity, feature);
+        start = 10L;
+        end = 20L;
+        entityResultRequest = new EntityResultRequest(detectorId, entities, start, end);
+    }
+
+    /**
+     * @throws IOException when serialization/deserialization has issues.
+     */
+    public void testDeSerializeEntityResultRequest() throws IOException {
+        setUpEntityResultRequest();
+
+        entityResultRequest.writeTo(output);
+
+        StreamInput streamInput = output.bytes().streamInput();
+        EntityResultRequest readRequest = new EntityResultRequest(streamInput);
+        assertThat(readRequest.getId(), equalTo(detectorId));
+        assertThat(readRequest.getStart(), equalTo(start));
+        assertThat(readRequest.getEnd(), equalTo(end));
+        assertTrue(areEqualWithArrayValue(readRequest.getEntities(), entities));
+    }
+
+    private void setUpEntityProfileRequest() {
+        profilesToCollect = new HashSet<EntityProfileName>();
+        profilesToCollect.add(EntityProfileName.STATE);
+        entityProfileRequest = new EntityProfileRequest(detectorId, entity, profilesToCollect);
+    }
+
+    /**
+     * @throws IOException when serialization/deserialization has issues.
+     */
+    public void testDeserializeEntityProfileRequest() throws IOException {
+        setUpEntityProfileRequest();
+
+        entityProfileRequest.writeTo(output);
+
+        StreamInput streamInput = output.bytes().streamInput();
+        EntityProfileRequest readRequest = new EntityProfileRequest(streamInput);
+        assertThat(readRequest.getAdID(), equalTo(detectorId));
+        assertThat(readRequest.getEntityValue(), equalTo(entity));
+        assertThat(readRequest.getProfilesToCollect(), equalTo(profilesToCollect));
+    }
+
+    private void setUpEntityProfileResponse() {
+        long lastActiveTimestamp = 10L;
+        EntityProfileResponse.Builder builder = new EntityProfileResponse.Builder();
+        builder.setLastActiveMs(lastActiveTimestamp).build();
+        ModelProfile modelProfile = new ModelProfile(modelId, entity, modelSize);
+        ModelProfileOnNode model = new ModelProfileOnNode(nodeId, modelProfile);
+        builder.setModelProfile(model);
+        entityProfileResponse = builder.build();
+    }
+
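Several of these fixtures exercise nullable wire fields on purpose: the EntityProfileResponse builder above leaves isActive unset, and setUpProfileResponse below feeds a node response a null model-size map. The deleted 1.0 classes earlier in this patch guarded such fields with the standard presence-flag idiom, which the test presumes the current classes retain; the writer side, sketched after ProfileNodeResponse1_0.writeTo:

    // A boolean presence flag tells the reader whether the nullable map follows.
    if (modelSize != null) {
        out.writeBoolean(true);
        out.writeMap(modelSize, StreamOutput::writeString, StreamOutput::writeLong);
    } else {
        out.writeBoolean(false);   // the reader leaves the field null
    }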
+    /**
+     * @throws IOException when serialization/deserialization has issues.
+     */
+    public void testDeserializeEntityProfileResponse() throws IOException {
+        setUpEntityProfileResponse();
+
+        entityProfileResponse.writeTo(output);
+
+        StreamInput streamInput = output.bytes().streamInput();
+        EntityProfileResponse readResponse = EntityProfileAction.INSTANCE.getResponseReader().read(streamInput);
+        assertThat(readResponse.getModelProfile(), equalTo(entityProfileResponse.getModelProfile()));
+        assertThat(readResponse.getLastActiveMs(), equalTo(entityProfileResponse.getLastActiveMs()));
+        assertThat(readResponse.getTotalUpdates(), equalTo(entityProfileResponse.getTotalUpdates()));
+    }
+
+    @SuppressWarnings("serial")
+    private void setUpProfileResponse() {
+        String node1 = "node1";
+        String nodeName1 = "nodename1";
+        DiscoveryNode discoveryNode1_1 = new DiscoveryNode(
+            nodeName1,
+            node1,
+            new TransportAddress(TransportAddress.META_ADDRESS, 9300),
+            emptyMap(),
+            emptySet(),
+            Version.V_2_1_0
+        );
+
+        String node2 = "node2";
+        String nodeName2 = "nodename2";
+        DiscoveryNode discoveryNode2 = new DiscoveryNode(
+            nodeName2,
+            node2,
+            new TransportAddress(TransportAddress.META_ADDRESS, 9301),
+            emptyMap(),
+            emptySet(),
+            Version.V_2_1_0
+        );
+
+        String model1Id = "model1";
+        String model2Id = "model2";
+
+        Map<String, Long> modelSizeMap1 = new HashMap<String, Long>() {
+            {
+                put(model1Id, modelSize);
+                put(model2Id, modelSize2);
+            }
+        };
+        Map<String, Long> modelSizeMap2 = new HashMap<String, Long>();
+
+        int shingleSize = 8;
+
+        ModelProfile modelProfile = new ModelProfile(model1Id, entity, modelSize);
+        ModelProfile modelProfile2 = new ModelProfile(model2Id, entity2, modelSize2);
+
+        ProfileNodeResponse profileNodeResponse1 = new ProfileNodeResponse(
+            discoveryNode1_1,
+            modelSizeMap1,
+            shingleSize,
+            0,
+            0,
+            Arrays.asList(modelProfile, modelProfile2),
+            modelSizeMap1.size()
+        );
+        ProfileNodeResponse profileNodeResponse2 = new ProfileNodeResponse(
+            discoveryNode2,
+            modelSizeMap2,
+            -1,
+            0,
+            0,
+            new ArrayList<>(),
+            modelSizeMap2.size()
+        );
+        ProfileNodeResponse profileNodeResponse3 = new ProfileNodeResponse(
+            discoveryNode2,
+            null,
+            -1,
+            0,
+            0,
+            // null model size. Test if we can handle this case
+            null,
+            modelSizeMap2.size()
+        );
+        List<ProfileNodeResponse> profileNodeResponses = Arrays.asList(profileNodeResponse1, profileNodeResponse2, profileNodeResponse3);
+        List<FailedNodeException> failures = Collections.emptyList();
+
+        ClusterName clusterName = new ClusterName("test-cluster-name");
+        profileResponse = new ProfileResponse(clusterName, profileNodeResponses, failures);
+    }
+
+    /**
+     * @throws IOException when serialization/deserialization has issues.
+ */ + public void testDeserializeProfileResponse() throws IOException { + setUpProfileResponse(); + + profileResponse.writeTo(output); + + StreamInput streamInput = output.bytes().streamInput(); + ProfileResponse readResponse = new ProfileResponse(streamInput); + assertThat(readResponse.getModelProfile(), equalTo(profileResponse.getModelProfile())); + assertThat(readResponse.getShingleSize(), equalTo(profileResponse.getShingleSize())); + assertThat(readResponse.getActiveEntities(), equalTo(profileResponse.getActiveEntities())); + assertThat(readResponse.getTotalUpdates(), equalTo(profileResponse.getTotalUpdates())); + assertThat(readResponse.getCoordinatingNode(), equalTo(profileResponse.getCoordinatingNode())); + assertThat(readResponse.getTotalSizeInBytes(), equalTo(profileResponse.getTotalSizeInBytes())); + assertThat(readResponse.getModelCount(), equalTo(profileResponse.getModelCount())); + } + + private void setUpRCFResultResponse() { + rcfResultResponse = new RCFResultResponse( + 0.345, + 0.123, + 30, + new double[] { 0.3, 0.7 }, + 134, + 0.4, + Version.CURRENT, + randomIntBetween(-3, 0), + new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) }, + new double[][] { new double[] { randomDouble(), randomDouble() } }, + new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) }, + randomDoubleBetween(1.1, 10.0, true) + ); + } + + /** + * @throws IOException when serialization/deserialization has issues. + */ + public void testDeserializeRCFResultResponse() throws IOException { + setUpRCFResultResponse(); + + rcfResultResponse.writeTo(output); + + StreamInput streamInput = output.bytes().streamInput(); + RCFResultResponse readResponse = new RCFResultResponse(streamInput); + assertArrayEquals(readResponse.getAttribution(), rcfResultResponse.getAttribution(), 0.001); + assertThat(readResponse.getConfidence(), equalTo(rcfResultResponse.getConfidence())); + assertThat(readResponse.getForestSize(), equalTo(rcfResultResponse.getForestSize())); + assertThat(readResponse.getTotalUpdates(), equalTo(rcfResultResponse.getTotalUpdates())); + assertThat(readResponse.getRCFScore(), equalTo(rcfResultResponse.getRCFScore())); + } +} diff --git a/src/test/java/org/opensearch/action/admin/indices/mapping/get/IndexAnomalyDetectorActionHandlerTests.java b/src/test/java/org/opensearch/action/admin/indices/mapping/get/IndexAnomalyDetectorActionHandlerTests.java index ebc36d314..aa2f30b02 100644 --- a/src/test/java/org/opensearch/action/admin/indices/mapping/get/IndexAnomalyDetectorActionHandlerTests.java +++ b/src/test/java/org/opensearch/action/admin/indices/mapping/get/IndexAnomalyDetectorActionHandlerTests.java @@ -20,7 +20,6 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; import java.io.IOException; import java.time.Clock; @@ -43,18 +42,11 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.common.exception.ADValidationException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.feature.SearchFeatureDao; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; 
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.rest.handler.IndexAnomalyDetectorActionHandler;
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.ad.transport.IndexAnomalyDetectorResponse;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -68,6 +60,13 @@
 import org.opensearch.rest.RestRequest;
 import org.opensearch.threadpool.TestThreadPool;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.ValidationException;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;
 
 /**
@@ -77,7 +76,7 @@
  * package private
  *
  */
-public class IndexAnomalyDetectorActionHandlerTests extends AbstractADTest {
+public class IndexAnomalyDetectorActionHandlerTests extends AbstractTimeSeriesTest {
     static ThreadPool threadPool;
     private ThreadContext threadContext;
     private String TEXT_FIELD_TYPE = "text";
@@ -87,7 +86,7 @@ public class IndexAnomalyDetectorActionHandlerTests extends AbstractADTest {
     private SecurityClientUtil clientUtil;
     private TransportService transportService;
     private ActionListener channel;
-    private AnomalyDetectionIndices anomalyDetectionIndices;
+    private ADIndexManagement anomalyDetectionIndices;
     private String detectorId;
     private Long seqNo;
     private Long primaryTerm;
@@ -130,8 +129,8 @@ public void setUp() throws Exception {
 
         channel = mock(ActionListener.class);
 
-        anomalyDetectionIndices = mock(AnomalyDetectionIndices.class);
-        when(anomalyDetectionIndices.doesAnomalyDetectorIndexExist()).thenReturn(true);
+        anomalyDetectionIndices = mock(ADIndexManagement.class);
+        when(anomalyDetectionIndices.doesConfigIndexExist()).thenReturn(true);
 
         detectorId = "123";
         seqNo = 0L;
@@ -184,7 +183,7 @@ public void setUp() throws Exception {
     // we support upto 2 category fields now
    public void testThreeCategoricalFields() throws IOException {
         expectThrows(
-            ADValidationException.class,
+            ValidationException.class,
             () -> TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId, Arrays.asList("a", "b", "c"))
         );
     }
@@ -345,7 +344,7 @@ public void doE
             if (action.equals(SearchAction.INSTANCE)) {
                 assertTrue(request instanceof SearchRequest);
                 SearchRequest searchRequest = (SearchRequest) request;
-                if (searchRequest.indices()[0].equals(ANOMALY_DETECTORS_INDEX)) {
+                if (searchRequest.indices()[0].equals(CommonName.CONFIG_INDEX)) {
                     listener.onResponse((Response) detectorResponse);
                 } else {
                     listener.onResponse((Response) userIndexResponse);
                 }
@@ -424,8 +423,7 @@ private void testUpdateTemplate(String fieldTypeName) throws IOException {
         int totalHits = 9;
         when(detectorResponse.getHits()).thenReturn(TestHelpers.createSearchHits(totalHits));
 
-        GetResponse getDetectorResponse = TestHelpers
-            .createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        GetResponse getDetectorResponse = TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX);
 
         SearchResponse userIndexResponse = mock(SearchResponse.class);
         int userIndexHits = 0;
@@ -444,7 +442,7 @@ public void doE
             if (action.equals(SearchAction.INSTANCE)) {
                 assertTrue(request instanceof SearchRequest);
                 SearchRequest searchRequest = (SearchRequest) request;
-                if (searchRequest.indices()[0].equals(ANOMALY_DETECTORS_INDEX)) {
+                if (searchRequest.indices()[0].equals(CommonName.CONFIG_INDEX)) {
                     listener.onResponse((Response) detectorResponse);
                 } else {
                     listener.onResponse((Response) userIndexResponse);
                 }
@@ -541,7 +539,7 @@ public void doE
             if (action.equals(SearchAction.INSTANCE)) {
                 assertTrue(request instanceof SearchRequest);
                 SearchRequest searchRequest = (SearchRequest) request;
-                if (searchRequest.indices()[0].equals(ANOMALY_DETECTORS_INDEX)) {
+                if (searchRequest.indices()[0].equals(CommonName.CONFIG_INDEX)) {
                     listener.onResponse((Response) detectorResponse);
                 } else {
                     listener.onResponse((Response) userIndexResponse);
                 }
@@ -622,7 +620,7 @@ public void testTenMultiEntityDetectorsUpdateSingleEntityAdToMulti() throws IOEx
         int totalHits = 10;
         AnomalyDetector existingDetector = TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId, null);
         GetResponse getDetectorResponse = TestHelpers
-            .createGetResponse(existingDetector, existingDetector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+            .createGetResponse(existingDetector, existingDetector.getId(), CommonName.CONFIG_INDEX);
 
         SearchResponse searchResponse = mock(SearchResponse.class);
         when(searchResponse.getHits()).thenReturn(TestHelpers.createSearchHits(totalHits));
@@ -705,8 +703,7 @@ public void testTenMultiEntityDetectorsUpdateSingleEntityAdToMulti() throws IOEx
     public void testTenMultiEntityDetectorsUpdateExistingMultiEntityAd() throws IOException {
         int totalHits = 10;
         AnomalyDetector detector = TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId, Arrays.asList("a"));
-        GetResponse getDetectorResponse = TestHelpers
-            .createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        GetResponse getDetectorResponse = TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX);
 
         SearchResponse searchResponse = mock(SearchResponse.class);
         when(searchResponse.getHits()).thenReturn(TestHelpers.createSearchHits(totalHits));
diff --git a/src/test/java/org/opensearch/action/admin/indices/mapping/get/ValidateAnomalyDetectorActionHandlerTests.java b/src/test/java/org/opensearch/action/admin/indices/mapping/get/ValidateAnomalyDetectorActionHandlerTests.java
index 88aac7bbd..4873d1501 100644
--- a/src/test/java/org/opensearch/action/admin/indices/mapping/get/ValidateAnomalyDetectorActionHandlerTests.java
+++ b/src/test/java/org/opensearch/action/admin/indices/mapping/get/ValidateAnomalyDetectorActionHandlerTests.java
@@ -31,20 +31,13 @@
 import org.mockito.MockitoAnnotations;
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.action.support.WriteRequest;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.ADValidationException;
-import org.opensearch.ad.feature.SearchFeatureDao;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.ValidationAspect;
 import org.opensearch.ad.rest.handler.AbstractAnomalyDetectorActionHandler;
 import org.opensearch.ad.rest.handler.IndexAnomalyDetectorActionHandler;
 import org.opensearch.ad.rest.handler.ValidateAnomalyDetectorActionHandler;
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.ad.transport.ValidateAnomalyDetectorResponse;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.client.node.NodeClient;
 import org.opensearch.cluster.service.ClusterService;
@@ -54,17 +47,24 @@
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.rest.RestRequest;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.ValidationException;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.model.ValidationAspect;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;
 
 import com.google.common.collect.ImmutableList;
 
-public class ValidateAnomalyDetectorActionHandlerTests extends AbstractADTest {
+public class ValidateAnomalyDetectorActionHandlerTests extends AbstractTimeSeriesTest {
 
     protected AbstractAnomalyDetectorActionHandler handler;
     protected ClusterService clusterService;
     protected ActionListener channel;
     protected TransportService transportService;
-    protected AnomalyDetectionIndices anomalyDetectionIndices;
+    protected ADIndexManagement anomalyDetectionIndices;
     protected String detectorId;
     protected Long seqNo;
     protected Long primaryTerm;
@@ -98,8 +98,8 @@ public void setUp() throws Exception {
         channel = mock(ActionListener.class);
         transportService = mock(TransportService.class);
 
-        anomalyDetectionIndices = mock(AnomalyDetectionIndices.class);
-        when(anomalyDetectionIndices.doesAnomalyDetectorIndexExist()).thenReturn(true);
+        anomalyDetectionIndices = mock(ADIndexManagement.class);
+        when(anomalyDetectionIndices.doesConfigIndexExist()).thenReturn(true);
 
         detectorId = "123";
         seqNo = 0L;
@@ -170,7 +170,7 @@ public void testValidateMoreThanThousandSingleEntityDetectorLimit() throws IOExc
         verify(clientSpy, never()).execute(eq(GetMappingsAction.INSTANCE), any(), any());
         verify(channel).onFailure(response.capture());
         Exception value = response.getValue();
-        assertTrue(value instanceof ADValidationException);
+        assertTrue(value instanceof ValidationException);
         String errorMsg = String
             .format(
                 Locale.ROOT,
@@ -224,7 +224,7 @@ public void testValidateMoreThanTenMultiEntityDetectorsLimit() throws IOExceptio
         verify(clientSpy, never()).execute(eq(GetMappingsAction.INSTANCE), any(), any());
         verify(channel).onFailure(response.capture());
         Exception value = response.getValue();
-        assertTrue(value instanceof ADValidationException);
+        assertTrue(value instanceof ValidationException);
         String errorMsg = String
             .format(
                 Locale.ROOT,
diff --git a/src/test/java/org/opensearch/ad/ADIntegTestCase.java b/src/test/java/org/opensearch/ad/ADIntegTestCase.java
index e44091b27..992f137f5 100644
--- a/src/test/java/org/opensearch/ad/ADIntegTestCase.java
+++ b/src/test/java/org/opensearch/ad/ADIntegTestCase.java
@@ -11,9 +11,8 @@
 
 package org.opensearch.ad;
 
-import static org.opensearch.ad.AbstractADTest.LOG;
-import static org.opensearch.ad.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
 import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder;
+import static org.opensearch.timeseries.util.RestHandlerUtils.XCONTENT_WITH_TYPE;
 
 import java.io.IOException;
 import java.time.Instant;
@@ -24,6 +23,8 @@
 import java.util.List;
 import java.util.Map;
 
+import org.apache.logging.log4j.LogManager;
+import
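
Taken together, the handler-test hunks above apply one mechanical migration: AnomalyDetectionIndices becomes ADIndexManagement, ADValidationException becomes the shared ValidationException, and the base class becomes AbstractTimeSeriesTest. A minimal sketch of the updated stubbing pattern, using only the renamed types and methods visible in these hunks (the sketch's class and method names are hypothetical):

    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import java.util.Arrays;

    import org.opensearch.ad.indices.ADIndexManagement;
    import org.opensearch.timeseries.AbstractTimeSeriesTest;
    import org.opensearch.timeseries.TestHelpers;
    import org.opensearch.timeseries.common.exception.ValidationException;

    public class RenamedStubsSketch extends AbstractTimeSeriesTest {
        public void testRenamedStubs() {
            // ADIndexManagement replaces AnomalyDetectionIndices as the mock target.
            ADIndexManagement indices = mock(ADIndexManagement.class);
            // doesConfigIndexExist() replaces doesAnomalyDetectorIndexExist().
            when(indices.doesConfigIndexExist()).thenReturn(true);

            // ValidationException replaces ADValidationException in expectThrows checks;
            // three category fields still exceed the two-field limit.
            expectThrows(
                ValidationException.class,
                () -> TestHelpers.randomAnomalyDetectorUsingCategoryFields("123", Arrays.asList("a", "b", "c"))
            );
        }
    }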
org.apache.logging.log4j.core.Logger; import org.junit.Before; import org.opensearch.action.admin.cluster.settings.ClusterUpdateSettingsRequest; import org.opensearch.action.admin.cluster.settings.ClusterUpdateSettingsResponse; @@ -39,15 +40,12 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.action.support.WriteRequest; import org.opensearch.action.support.master.AcknowledgedResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.mock.plugin.MockReindexPlugin; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.client.node.NodeClient; import org.opensearch.cluster.node.DiscoveryNode; @@ -62,10 +60,17 @@ import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.test.OpenSearchIntegTestCase; import org.opensearch.test.transport.MockTransportService; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.util.RestHandlerUtils; import com.google.common.collect.ImmutableMap; public abstract class ADIntegTestCase extends OpenSearchIntegTestCase { + protected static final Logger LOG = (Logger) LogManager.getLogger(ADIntegTestCase.class); private long timeout = 5_000; protected String timeField = "timestamp"; @@ -77,11 +82,11 @@ public abstract class ADIntegTestCase extends OpenSearchIntegTestCase { @Override protected Collection> nodePlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } protected Collection> transportClientPlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } @Override @@ -101,43 +106,43 @@ public void setUp() throws Exception { public void createDetectors(List detectors, boolean createIndexFirst) throws IOException { if (createIndexFirst) { - createIndex(AnomalyDetector.ANOMALY_DETECTORS_INDEX, AnomalyDetectionIndices.getAnomalyDetectorMappings()); + createIndex(CommonName.CONFIG_INDEX, ADIndexManagement.getConfigMappings()); } for (AnomalyDetector detector : detectors) { - indexDoc(AnomalyDetector.ANOMALY_DETECTORS_INDEX, detector.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); + indexDoc(CommonName.CONFIG_INDEX, detector.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); } } public String createDetector(AnomalyDetector detector) throws IOException { - return indexDoc(AnomalyDetector.ANOMALY_DETECTORS_INDEX, detector.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); + return indexDoc(CommonName.CONFIG_INDEX, detector.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); } public String createADResult(AnomalyResult adResult) throws IOException { - return indexDoc(CommonName.ANOMALY_RESULT_INDEX_ALIAS, adResult.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); + return indexDoc(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, 
adResult.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); } public String createADTask(ADTask adTask) throws IOException { if (adTask.getTaskId() != null) { - return indexDoc(CommonName.DETECTION_STATE_INDEX, adTask.getTaskId(), adTask.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); + return indexDoc(ADCommonName.DETECTION_STATE_INDEX, adTask.getTaskId(), adTask.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); } - return indexDoc(CommonName.DETECTION_STATE_INDEX, adTask.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); + return indexDoc(ADCommonName.DETECTION_STATE_INDEX, adTask.toXContent(jsonBuilder(), XCONTENT_WITH_TYPE)); } public void createDetectorIndex() throws IOException { - createIndex(AnomalyDetector.ANOMALY_DETECTORS_INDEX, AnomalyDetectionIndices.getAnomalyDetectorMappings()); + createIndex(CommonName.CONFIG_INDEX, ADIndexManagement.getConfigMappings()); } public void createADResultIndex() throws IOException { - createIndex(CommonName.ANOMALY_RESULT_INDEX_ALIAS, AnomalyDetectionIndices.getAnomalyResultMappings()); + createIndex(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, ADIndexManagement.getResultMappings()); } public void createCustomADResultIndex(String indexName) throws IOException { - createIndex(indexName, AnomalyDetectionIndices.getAnomalyResultMappings()); + createIndex(indexName, ADIndexManagement.getResultMappings()); } public void createDetectionStateIndex() throws IOException { - createIndex(CommonName.DETECTION_STATE_INDEX, AnomalyDetectionIndices.getDetectionStateMappings()); + createIndex(ADCommonName.DETECTION_STATE_INDEX, ADIndexManagement.getStateMappings()); } public void createTestDataIndex(String indexName) { @@ -159,7 +164,7 @@ public void createIndex(String indexName, String mappings) { } public AcknowledgedResponse deleteDetectorIndex() { - return deleteIndex(AnomalyDetector.ANOMALY_DETECTORS_INDEX); + return deleteIndex(CommonName.CONFIG_INDEX); } public AcknowledgedResponse deleteIndex(String indexName) { @@ -208,7 +213,7 @@ public BulkResponse bulkIndexObjects(String indexName, Li } catch (Exception e) { String error = "Failed to prepare request to bulk index docs"; LOG.error(error, e); - throw new AnomalyDetectionException(error); + throw new TimeSeriesException(error); } }); return client().bulk(bulkRequestBuilder.request()).actionGet(timeout); @@ -238,7 +243,7 @@ public long countDetectorDocs(String detectorId) { SearchRequest request = new SearchRequest(); SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder(); searchSourceBuilder.query(new TermQueryBuilder("detector_id", detectorId)).size(10); - request.indices(CommonName.DETECTION_STATE_INDEX).source(searchSourceBuilder); + request.indices(ADCommonName.DETECTION_STATE_INDEX).source(searchSourceBuilder); SearchResponse searchResponse = client().search(request).actionGet(timeout); return searchResponse.getHits().getTotalHits().value; } diff --git a/src/test/java/org/opensearch/ad/AbstractProfileRunnerTests.java b/src/test/java/org/opensearch/ad/AbstractProfileRunnerTests.java index bb99c238c..a98eef88d 100644 --- a/src/test/java/org/opensearch/ad/AbstractProfileRunnerTests.java +++ b/src/test/java/org/opensearch/ad/AbstractProfileRunnerTests.java @@ -36,17 +36,19 @@ import org.opensearch.ad.model.DetectorProfileName; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.AnomalyResultTests; -import org.opensearch.ad.util.DiscoveryNodeFilterer; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import 
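
The ADIntegTestCase hunks above follow one rule: configuration documents move to the shared CommonName.CONFIG_INDEX, while AD-specific indices (results, detection state) keep their names but fetch them from ADCommonName, with mappings now served by ADIndexManagement. A sketch of the resulting bootstrap helpers, reusing the createIndex(String, String) helper this test case defines (the bootstrapIndices method name is hypothetical):

    import java.io.IOException;

    import org.opensearch.ad.constant.ADCommonName;
    import org.opensearch.ad.indices.ADIndexManagement;
    import org.opensearch.timeseries.constant.CommonName;

    public abstract class IndexBootstrapSketch extends ADIntegTestCase {
        void bootstrapIndices() throws IOException {
            // Detector configs now live in the shared config index.
            createIndex(CommonName.CONFIG_INDEX, ADIndexManagement.getConfigMappings());
            // Results and task state stay AD-specific, but the constants moved to
            // ADCommonName and the mapping getters dropped their AD-centric prefixes.
            createIndex(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, ADIndexManagement.getResultMappings());
            createIndex(ADCommonName.DETECTION_STATE_INDEX, ADIndexManagement.getStateMappings());
        }
    }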
org.opensearch.cluster.ClusterName; import org.opensearch.cluster.ClusterState; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; import org.opensearch.core.common.transport.TransportAddress; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; -public class AbstractProfileRunnerTests extends AbstractADTest { +public class AbstractProfileRunnerTests extends AbstractTimeSeriesTest { protected enum DetectorStatus { INDEX_NOT_EXIST, NO_DOC, diff --git a/src/test/java/org/opensearch/ad/AnomalyDetectorJobRunnerTests.java b/src/test/java/org/opensearch/ad/AnomalyDetectorJobRunnerTests.java index 9214915de..ed5be8fb0 100644 --- a/src/test/java/org/opensearch/ad/AnomalyDetectorJobRunnerTests.java +++ b/src/test/java/org/opensearch/ad/AnomalyDetectorJobRunnerTests.java @@ -22,9 +22,8 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.NUM_MIN_SAMPLES; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO; +import static org.opensearch.timeseries.settings.TimeSeriesSettings.NUM_MIN_SAMPLES; import java.io.IOException; import java.time.Instant; @@ -53,23 +52,16 @@ import org.opensearch.action.index.IndexRequest; import org.opensearch.action.index.IndexResponse; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.FeatureData; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.task.ADTaskCacheManager; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.AnomalyResultAction; import org.opensearch.ad.transport.AnomalyResultResponse; import org.opensearch.ad.transport.handler.AnomalyIndexHandler; -import org.opensearch.ad.util.ClientUtil; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.ClusterSettings; @@ -90,10 +82,24 @@ import org.opensearch.jobscheduler.spi.schedule.Schedule; import org.opensearch.jobscheduler.spi.utils.LockService; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonName; +import 
org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import com.google.common.collect.ImmutableList; -public class AnomalyDetectorJobRunnerTests extends AbstractADTest { +public class AnomalyDetectorJobRunnerTests extends AbstractTimeSeriesTest { @Mock private Client client; @@ -107,7 +113,7 @@ public class AnomalyDetectorJobRunnerTests extends AbstractADTest { private LockService lockService; @Mock - private AnomalyDetectorJob jobParameter; + private Job jobParameter; @Mock private JobExecutionContext context; @@ -141,7 +147,7 @@ public class AnomalyDetectorJobRunnerTests extends AbstractADTest { @Mock private NodeStateManager nodeStateManager; - private AnomalyDetectionIndices anomalyDetectionIndices; + private ADIndexManagement anomalyDetectionIndices; @BeforeClass public static void setUpBeforeClass() { @@ -179,7 +185,7 @@ public void setup() throws Exception { runner.setSettings(settings); - anomalyDetectionIndices = mock(AnomalyDetectionIndices.class); + anomalyDetectionIndices = mock(ADIndexManagement.class); runner.setAnomalyDetectionIndices(anomalyDetectionIndices); @@ -191,9 +197,9 @@ public void setup() throws Exception { GetRequest request = (GetRequest) args[0]; ActionListener listener = (ActionListener) args[1]; - if (request.index().equals(ANOMALY_DETECTOR_JOB_INDEX)) { - AnomalyDetectorJob job = TestHelpers.randomAnomalyDetectorJob(true); - listener.onResponse(TestHelpers.createGetResponse(job, randomAlphaOfLength(5), ANOMALY_DETECTOR_JOB_INDEX)); + if (request.index().equals(CommonName.JOB_INDEX)) { + Job job = TestHelpers.randomAnomalyDetectorJob(true); + listener.onResponse(TestHelpers.createGetResponse(job, randomAlphaOfLength(5), CommonName.JOB_INDEX)); } return null; }).when(client).get(any(), any()); @@ -215,7 +221,7 @@ public void setup() throws Exception { } assertTrue(request != null && listener != null); - ShardId shardId = new ShardId(new Index(ANOMALY_DETECTOR_JOB_INDEX, randomAlphaOfLength(10)), 0); + ShardId shardId = new ShardId(new Index(CommonName.JOB_INDEX, randomAlphaOfLength(10)), 0); listener.onResponse(new IndexResponse(shardId, request.id(), 1, 1, 1, true)); return null; @@ -225,10 +231,10 @@ public void setup() throws Exception { detector = TestHelpers.randomAnomalyDetectorWithEmptyFeature(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); runner.setNodeStateManager(nodeStateManager); recorder = new ExecuteADResultResponseRecorder( @@ -258,7 +264,7 @@ public void tearDown() throws Exception { @Test public void testRunJobWithWrongParameterType() { expectedEx.expect(IllegalArgumentException.class); - expectedEx.expectMessage("Job parameter is not instance of AnomalyDetectorJob, type: "); + expectedEx.expectMessage("Job parameter is not instance of Job, type: "); ScheduledJobParameter parameter = mock(ScheduledJobParameter.class); when(jobParameter.getLockDurationSeconds()).thenReturn(null); @@ -375,14 +381,14 @@ public void 
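
The most mechanical change in AnomalyDetectorJobRunnerTests is the NodeStateManager stub: getAnomalyDetector(String, ActionListener) became getConfig(String, AnalysisType, ActionListener), so the listener moves from argument index 1 to index 2 and every stub and verification gains an eq(AnalysisType.AD) matcher. A sketch of the pattern, with generics restored for readability since this patch listing strips them (the surrounding class is hypothetical; the signatures are the ones used in the hunks above):

    import static org.mockito.ArgumentMatchers.any;
    import static org.mockito.ArgumentMatchers.eq;
    import static org.mockito.Mockito.doAnswer;
    import static org.mockito.Mockito.mock;

    import java.util.Optional;

    import org.opensearch.ad.model.AnomalyDetector;
    import org.opensearch.core.action.ActionListener;
    import org.opensearch.timeseries.AnalysisType;
    import org.opensearch.timeseries.NodeStateManager;

    public class GetConfigStubSketch {
        @SuppressWarnings("unchecked")
        static NodeStateManager stubConfig(AnomalyDetector detector) {
            NodeStateManager nodeStateManager = mock(NodeStateManager.class);
            doAnswer(invocation -> {
                // The listener is now the third argument (index 2),
                // after the new AnalysisType parameter.
                ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(2);
                listener.onResponse(Optional.of(detector));
                return null;
            }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));
            return nodeStateManager;
        }
    }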
testRunAdJobWithEndRunExceptionNowAndNotExistingDisabledAdJob() { } private void testRunAdJobWithEndRunExceptionNowAndStopAdJob(boolean jobExists, boolean jobEnabled, boolean disableSuccessfully) { - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); Exception exception = new EndRunException(jobParameter.getName(), randomAlphaOfLength(5), true); doAnswer(invocation -> { ActionListener listener = invocation.getArgument(1); GetResponse response = new GetResponse( new GetResult( - AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, + CommonName.JOB_INDEX, jobParameter.getName(), UNASSIGNED_SEQ_NO, 0, @@ -390,7 +396,7 @@ private void testRunAdJobWithEndRunExceptionNowAndStopAdJob(boolean jobExists, b jobExists, BytesReference .bytes( - new AnomalyDetectorJob( + new Job( jobParameter.getName(), jobParameter.getSchedule(), jobParameter.getWindowDelay(), @@ -400,7 +406,7 @@ private void testRunAdJobWithEndRunExceptionNowAndStopAdJob(boolean jobExists, b Instant.now(), 60L, TestHelpers.randomUser(), - jobParameter.getResultIndex() + jobParameter.getCustomResultIndex() ).toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS) ), Collections.emptyMap(), @@ -415,7 +421,7 @@ private void testRunAdJobWithEndRunExceptionNowAndStopAdJob(boolean jobExists, b doAnswer(invocation -> { IndexRequest request = invocation.getArgument(0); ActionListener listener = invocation.getArgument(1); - ShardId shardId = new ShardId(new Index(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, randomAlphaOfLength(10)), 0); + ShardId shardId = new ShardId(new Index(CommonName.JOB_INDEX, randomAlphaOfLength(10)), 0); if (disableSuccessfully) { listener.onResponse(new IndexResponse(shardId, request.id(), 1, 1, 1, true)); } else { @@ -476,7 +482,7 @@ public void testRunAdJobWithEndRunExceptionNowAndFailToGetJob() { GetRequest request = (GetRequest) args[0]; ActionListener listener = (ActionListener) args[1]; - if (request.index().equals(ANOMALY_DETECTOR_JOB_INDEX)) { + if (request.index().equals(CommonName.JOB_INDEX)) { listener.onFailure(new RuntimeException("fail to get AD job")); } return null; @@ -499,7 +505,7 @@ public void testRunAdJobWithEndRunExceptionNowAndFailToGetJob() { @Test public void testRunAdJobWithEndRunExceptionNotNowAndRetryUntilStop() throws InterruptedException { - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); Instant executionStartTime = Instant.now(); Schedule schedule = mock(IntervalSchedule.class); when(jobParameter.getSchedule()).thenReturn(schedule); @@ -549,7 +555,7 @@ public Instant confirmInitializedSetup() { Collections.singletonList(new FeatureData("123", "abc", 0d)), randomAlphaOfLength(4), // not fully initialized - Long.valueOf(AnomalyDetectorSettings.NUM_MIN_SAMPLES - 1), + Long.valueOf(TimeSeriesSettings.NUM_MIN_SAMPLES - 1), randomLong(), // not an HC detector false, @@ -573,22 +579,22 @@ public void testFailtoFindDetector() { Instant executionStartTime = confirmInitializedSetup(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onFailure(new RuntimeException()); return null; - 
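
The hunks above also rename the job model itself: AnomalyDetectorJob becomes Job, the index constant moves from AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX to CommonName.JOB_INDEX, and getResultIndex() becomes getCustomResultIndex(). A sketch of the lock and shard-id construction as it now reads, assuming the 2.x core locations of Index and ShardId (the values mirror the test's own: a 10-second lock, shard 0):

    import java.time.Instant;

    import org.opensearch.core.index.Index;
    import org.opensearch.core.index.shard.ShardId;
    import org.opensearch.jobscheduler.spi.LockModel;
    import org.opensearch.timeseries.constant.CommonName;

    public class JobIndexSketch {
        static LockModel lockFor(String jobName) {
            // Locks are now keyed on the shared job index name.
            return new LockModel(CommonName.JOB_INDEX, jobName, Instant.now(), 10, false);
        }

        static ShardId shardFor(String indexUuid) {
            // Stubbed index responses point at the same renamed index.
            return new ShardId(new Index(CommonName.JOB_INDEX, indexUuid), 0);
        }
    }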
}).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); runner.runAdJob(jobParameter, lockService, lock, Instant.now().minusSeconds(60), executionStartTime, recorder, detector); verify(client, times(1)).execute(eq(AnomalyResultAction.INSTANCE), any(), any()); verify(adTaskCacheManager, times(1)).hasQueriedResultIndex(anyString()); - verify(nodeStateManager, times(1)).getAnomalyDetector(any(String.class), any(ActionListener.class)); - verify(nodeStateManager, times(0)).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); + verify(nodeStateManager, times(0)).getJob(any(String.class), any(ActionListener.class)); verify(adTaskManager, times(1)).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); assertEquals(1, testAppender.countMessage("Fail to confirm rcf update")); - assertTrue(testAppender.containExceptionMsg(AnomalyDetectionException.class, "fail to get detector")); + assertTrue(testAppender.containExceptionMsg(TimeSeriesException.class, "fail to get detector")); } @SuppressWarnings("unchecked") @@ -596,28 +602,28 @@ public void testFailtoFindJob() { Instant executionStartTime = confirmInitializedSetup(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(1); listener.onFailure(new RuntimeException()); return null; - }).when(nodeStateManager).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getJob(any(String.class), any(ActionListener.class)); - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); runner.runAdJob(jobParameter, lockService, lock, Instant.now().minusSeconds(60), executionStartTime, recorder, detector); verify(client, times(1)).execute(eq(AnomalyResultAction.INSTANCE), any(), any()); verify(adTaskCacheManager, times(1)).hasQueriedResultIndex(anyString()); - verify(nodeStateManager, times(1)).getAnomalyDetector(any(String.class), any(ActionListener.class)); - verify(nodeStateManager, times(1)).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getJob(any(String.class), any(ActionListener.class)); verify(adTaskManager, times(1)).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); assertEquals(1, testAppender.countMessage("Fail to confirm rcf 
update")); - assertTrue(testAppender.containExceptionMsg(AnomalyDetectionException.class, "fail to get job")); + assertTrue(testAppender.containExceptionMsg(TimeSeriesException.class, "fail to get job")); } @SuppressWarnings("unchecked") @@ -625,22 +631,22 @@ public void testEmptyDetector() { Instant executionStartTime = confirmInitializedSetup(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.empty()); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); runner.runAdJob(jobParameter, lockService, lock, Instant.now().minusSeconds(60), executionStartTime, recorder, detector); verify(client, times(1)).execute(eq(AnomalyResultAction.INSTANCE), any(), any()); verify(adTaskCacheManager, times(1)).hasQueriedResultIndex(anyString()); - verify(nodeStateManager, times(1)).getAnomalyDetector(any(String.class), any(ActionListener.class)); - verify(nodeStateManager, times(0)).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); + verify(nodeStateManager, times(0)).getJob(any(String.class), any(ActionListener.class)); verify(adTaskManager, times(1)).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); assertEquals(1, testAppender.countMessage("Fail to confirm rcf update")); - assertTrue(testAppender.containExceptionMsg(AnomalyDetectionException.class, "fail to get detector")); + assertTrue(testAppender.containExceptionMsg(TimeSeriesException.class, "fail to get detector")); } @SuppressWarnings("unchecked") @@ -648,28 +654,28 @@ public void testEmptyJob() { Instant executionStartTime = confirmInitializedSetup(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(1); listener.onResponse(Optional.empty()); return null; - }).when(nodeStateManager).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getJob(any(String.class), any(ActionListener.class)); - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); runner.runAdJob(jobParameter, lockService, lock, Instant.now().minusSeconds(60), executionStartTime, recorder, detector); verify(client, times(1)).execute(eq(AnomalyResultAction.INSTANCE), any(), any()); verify(adTaskCacheManager, times(1)).hasQueriedResultIndex(anyString()); - verify(nodeStateManager, times(1)).getAnomalyDetector(any(String.class), 
any(ActionListener.class)); - verify(nodeStateManager, times(1)).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getJob(any(String.class), any(ActionListener.class)); verify(adTaskManager, times(1)).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); assertEquals(1, testAppender.countMessage("Fail to confirm rcf update")); - assertTrue(testAppender.containExceptionMsg(AnomalyDetectionException.class, "fail to get job")); + assertTrue(testAppender.containExceptionMsg(TimeSeriesException.class, "fail to get job")); } @SuppressWarnings("unchecked") @@ -678,21 +684,21 @@ public void testMarkResultIndexQueried() throws IOException { .newInstance() .setDetectionInterval(new IntervalTimeConfiguration(1, ChronoUnit.MINUTES)) .setCategoryFields(ImmutableList.of(randomAlphaOfLength(5))) - .setResultIndex(CommonName.CUSTOM_RESULT_INDEX_PREFIX + "index") + .setResultIndex(ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "index") .build(); Instant executionStartTime = confirmInitializedSetup(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(1); listener.onResponse(Optional.of(TestHelpers.randomAnomalyDetectorJob(true, Instant.ofEpochMilli(1602401500000L), null))); return null; - }).when(nodeStateManager).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getJob(any(String.class), any(ActionListener.class)); doAnswer(invocation -> { Object[] args = invocation.getArguments(); @@ -712,7 +718,7 @@ public void testMarkResultIndexQueried() throws IOException { Settings settings = Settings .builder() .put(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE.getKey(), 2) - .put(AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS.getKey(), 100) + .put(TimeSeriesSettings.MAX_CACHED_DELETED_TASKS.getKey(), 100) .build(); clusterService = mock(ClusterService.class); @@ -721,7 +727,7 @@ public void testMarkResultIndexQueried() throws IOException { Collections .unmodifiableSet( new HashSet<>( - Arrays.asList(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS) + Arrays.asList(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, TimeSeriesSettings.MAX_CACHED_DELETED_TASKS) ) ) ); @@ -731,7 +737,7 @@ public void testMarkResultIndexQueried() throws IOException { // init real time task cache for the detector. We will do this during AnomalyResultTransportAction. // Since we mocked the execution by returning anomaly result directly, we need to init it explicitly. 
- adTaskCacheManager.initRealtimeTaskCache(detector.getDetectorId(), 0); + adTaskCacheManager.initRealtimeTaskCache(detector.getId(), 0); // recreate recorder since we need to use the unmocked adTaskCacheManager recorder = new ExecuteADResultResponseRecorder( @@ -746,21 +752,21 @@ public void testMarkResultIndexQueried() throws IOException { 32 ); - assertEquals(false, adTaskCacheManager.hasQueriedResultIndex(detector.getDetectorId())); + assertEquals(false, adTaskCacheManager.hasQueriedResultIndex(detector.getId())); - LockModel lock = new LockModel(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); + LockModel lock = new LockModel(CommonName.JOB_INDEX, jobParameter.getName(), Instant.now(), 10, false); runner.runAdJob(jobParameter, lockService, lock, Instant.now().minusSeconds(60), executionStartTime, recorder, detector); verify(client, times(1)).execute(eq(AnomalyResultAction.INSTANCE), any(), any()); + verify(nodeStateManager, times(1)).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); + verify(nodeStateManager, times(1)).getJob(any(String.class), any(ActionListener.class)); verify(client, times(1)).search(any(), any()); - verify(nodeStateManager, times(1)).getAnomalyDetector(any(String.class), any(ActionListener.class)); - verify(nodeStateManager, times(1)).getAnomalyDetectorJob(any(String.class), any(ActionListener.class)); ArgumentCaptor totalUpdates = ArgumentCaptor.forClass(Long.class); verify(adTaskManager, times(1)) .updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), totalUpdates.capture(), any(), any(), any()); assertEquals(NUM_MIN_SAMPLES, totalUpdates.getValue().longValue()); - assertEquals(true, adTaskCacheManager.hasQueriedResultIndex(detector.getDetectorId())); + assertEquals(true, adTaskCacheManager.hasQueriedResultIndex(detector.getId())); } } diff --git a/src/test/java/org/opensearch/ad/AnomalyDetectorProfileRunnerTests.java b/src/test/java/org/opensearch/ad/AnomalyDetectorProfileRunnerTests.java index 6ef85f971..cb88bde96 100644 --- a/src/test/java/org/opensearch/ad/AnomalyDetectorProfileRunnerTests.java +++ b/src/test/java/org/opensearch/ad/AnomalyDetectorProfileRunnerTests.java @@ -17,8 +17,6 @@ import static org.mockito.ArgumentMatchers.anyBoolean; import static org.mockito.ArgumentMatchers.anyString; import static org.mockito.Mockito.*; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import java.io.IOException; import java.time.Instant; @@ -39,26 +37,20 @@ import org.opensearch.action.FailedNodeException; import org.opensearch.action.get.GetRequest; import org.opensearch.action.get.GetResponse; -import org.opensearch.ad.common.exception.AnomalyDetectionException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.DetectorInternalState; import org.opensearch.ad.model.DetectorProfile; import org.opensearch.ad.model.DetectorProfileName; import org.opensearch.ad.model.DetectorState; import org.opensearch.ad.model.InitProgressProfile; -import org.opensearch.ad.model.IntervalTimeConfiguration; import 
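
testMarkResultIndexQueried above also shows the settings split: batch-task limits stay in AnomalyDetectorSettings, while MAX_CACHED_DELETED_TASKS moves to the shared TimeSeriesSettings, so the ClusterSettings registration now mixes the two. A sketch of that setup, copied from the hunk with imports made explicit (the wrapper class is hypothetical):

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.HashSet;

    import org.opensearch.ad.settings.AnomalyDetectorSettings;
    import org.opensearch.common.settings.ClusterSettings;
    import org.opensearch.common.settings.Settings;
    import org.opensearch.timeseries.settings.TimeSeriesSettings;

    public class MixedSettingsSketch {
        static ClusterSettings clusterSettings() {
            Settings settings = Settings
                .builder()
                .put(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE.getKey(), 2)
                // Moved: this constant now lives in the shared timeseries settings.
                .put(TimeSeriesSettings.MAX_CACHED_DELETED_TASKS.getKey(), 100)
                .build();
            // Both settings classes must be registered side by side.
            return new ClusterSettings(
                settings,
                Collections
                    .unmodifiableSet(
                        new HashSet<>(
                            Arrays.asList(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, TimeSeriesSettings.MAX_CACHED_DELETED_TASKS)
                        )
                    )
            );
        }
    }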
org.opensearch.ad.model.ModelProfileOnNode; import org.opensearch.ad.transport.ProfileAction; import org.opensearch.ad.transport.ProfileNodeResponse; import org.opensearch.ad.transport.ProfileResponse; import org.opensearch.ad.transport.RCFPollingAction; import org.opensearch.ad.transport.RCFPollingResponse; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.cluster.ClusterName; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.settings.Settings; @@ -66,6 +58,16 @@ import org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper; import org.opensearch.core.common.transport.TransportAddress; import org.opensearch.index.IndexNotFoundException; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.common.exception.TimeSeriesException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.RemoteTransportException; public class AnomalyDetectorProfileRunnerTests extends AbstractProfileRunnerTests { @@ -100,10 +102,10 @@ private void setUpClientGet( detector = TestHelpers.randomAnomalyDetectorWithInterval(new IntervalTimeConfiguration(detectorIntervalMin, ChronoUnit.MINUTES)); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(anyString(), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(anyString(), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, Settings.EMPTY); runner = new AnomalyDetectorProfileRunner( client, @@ -120,16 +122,13 @@ private void setUpClientGet( GetRequest request = (GetRequest) args[0]; ActionListener listener = (ActionListener) args[1]; - if (request.index().equals(ANOMALY_DETECTORS_INDEX)) { + if (request.index().equals(CommonName.CONFIG_INDEX)) { switch (detectorStatus) { case EXIST: - listener - .onResponse( - TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); break; case INDEX_NOT_EXIST: - listener.onFailure(new IndexNotFoundException(ANOMALY_DETECTORS_INDEX)); + listener.onFailure(new IndexNotFoundException(CommonName.CONFIG_INDEX)); break; case NO_DOC: when(detectorGetReponse.isExists()).thenReturn(false); @@ -139,25 +138,19 @@ private void setUpClientGet( assertTrue("should not reach here", false); break; } - } else if (request.index().equals(ANOMALY_DETECTOR_JOB_INDEX)) { - AnomalyDetectorJob job = null; + } else if (request.index().equals(CommonName.JOB_INDEX)) { + Job job = null; switch (jobStatus) { case INDEX_NOT_EXIT: - listener.onFailure(new IndexNotFoundException(ANOMALY_DETECTOR_JOB_INDEX)); + listener.onFailure(new IndexNotFoundException(CommonName.JOB_INDEX)); break; case DISABLED: job = TestHelpers.randomAnomalyDetectorJob(false, jobEnabledTime, 
null); - listener - .onResponse( - TestHelpers.createGetResponse(job, detector.getDetectorId(), AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(job, detector.getId(), CommonName.JOB_INDEX)); break; case ENABLED: job = TestHelpers.randomAnomalyDetectorJob(true, jobEnabledTime, null); - listener - .onResponse( - TestHelpers.createGetResponse(job, detector.getDetectorId(), AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(job, detector.getId(), CommonName.JOB_INDEX)); break; default: assertTrue("should not reach here", false); @@ -165,7 +158,7 @@ private void setUpClientGet( } } else { if (errorResultStatus == ErrorResultStatus.INDEX_NOT_EXIT) { - listener.onFailure(new IndexNotFoundException(CommonName.DETECTION_STATE_INDEX)); + listener.onFailure(new IndexNotFoundException(ADCommonName.DETECTION_STATE_INDEX)); return null; } DetectorInternalState.Builder result = new DetectorInternalState.Builder().lastUpdateTime(Instant.now()); @@ -174,8 +167,7 @@ private void setUpClientGet( if (error != null) { result.error(error); } - listener - .onResponse(TestHelpers.createGetResponse(result.build(), detector.getDetectorId(), CommonName.DETECTION_STATE_INDEX)); + listener.onResponse(TestHelpers.createGetResponse(result.build(), detector.getId(), ADCommonName.DETECTION_STATE_INDEX)); } @@ -208,7 +200,7 @@ public void testDetectorNotExist() throws IOException, InterruptedException { assertTrue("Should not reach here", false); inProgressLatch.countDown(); }, exception -> { - assertTrue(exception.getMessage().contains(CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG)); + assertTrue(exception.getMessage().contains(CommonMessages.FAIL_TO_FIND_CONFIG_MSG)); inProgressLatch.countDown(); }), stateNError); assertTrue(inProgressLatch.await(100, TimeUnit.SECONDS)); @@ -219,7 +211,7 @@ public void testDisabledJobIndexTemplate(JobStatus status) throws IOException, I DetectorProfile expectedProfile = new DetectorProfile.Builder().state(DetectorState.DISABLED).build(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -243,7 +235,7 @@ public void testInitOrRunningStateTemplate(RCFPollingStatus status, DetectorStat DetectorProfile expectedProfile = new DetectorProfile.Builder().state(expectedState).build(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -313,7 +305,7 @@ public void testErrorStateTemplate( DetectorProfile expectedProfile = builder.build(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -479,13 +471,13 @@ private void setUpClientExecuteRCFPollingAction(RCFPollingStatus inittedEverResu break; case INDEX_NOT_FOUND: case REMOTE_INDEX_NOT_FOUND: - cause = new IndexNotFoundException(detectorId, CommonName.CHECKPOINT_INDEX_NAME); + cause = new 
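
Across the profile-runner tests, the calling convention is the one constant: runner.profile(detector.getId(), ...) replaces detector.getDetectorId(), and every assertion runs inside an ActionListener.wrap guarded by a CountDownLatch so the test thread can block until the async callback fires. A sketch of that recurring harness, factored into a generic helper (the helper itself is hypothetical; ActionListener.wrap and the latch idiom are taken from the hunks above):

    import java.util.concurrent.CountDownLatch;

    import org.junit.Assert;
    import org.opensearch.core.action.ActionListener;

    public class LatchListenerSketch {
        // Builds the listener these tests use: assert on the callback thread,
        // then release the latch so the awaiting test thread can proceed.
        static <T> ActionListener<T> expecting(T expected, CountDownLatch latch) {
            return ActionListener.wrap(response -> {
                Assert.assertEquals(expected, response);
                latch.countDown();
            }, exception -> {
                Assert.fail("Should not reach here: " + exception.getMessage());
            });
        }
    }

In the tests above, this listener would be handed to runner.profile(detector.getId(), ...) along with the profile names to collect, followed by assertTrue(latch.await(100, TimeUnit.SECONDS)).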
IndexNotFoundException(detectorId, ADCommonName.CHECKPOINT_INDEX_NAME); break; default: assertTrue("should not reach here", false); break; } - cause = new AnomalyDetectionException(detectorId, cause); + cause = new TimeSeriesException(detectorId, cause); if (inittedEverResultStatus == RCFPollingStatus.REMOTE_INIT_NOT_EXIT || inittedEverResultStatus == RCFPollingStatus.REMOTE_INDEX_NOT_FOUND) { cause = new RemoteTransportException(RCFPollingAction.NAME, new NotSerializableExceptionWrapper(cause)); @@ -524,7 +516,7 @@ public void testProfileModels() throws InterruptedException, IOException { final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(profileResponse -> { + runner.profile(detector.getId(), ActionListener.wrap(profileResponse -> { assertEquals(node1, profileResponse.getCoordinatingNode()); assertEquals(shingleSize, profileResponse.getShingleSize()); assertEquals(modelSize * 2, profileResponse.getTotalSizeInBytes()); @@ -556,7 +548,7 @@ public void testInitProgress() throws IOException, InterruptedException { expectedProfile.setInitProgress(profile); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -575,11 +567,11 @@ public void testInitProgressFailImmediately() throws IOException, InterruptedExc expectedProfile.setInitProgress(profile); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertTrue("Should not reach here ", false); inProgressLatch.countDown(); }, exception -> { - assertTrue(exception.getMessage().contains(CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG)); + assertTrue(exception.getMessage().contains(CommonMessages.FAIL_TO_FIND_CONFIG_MSG)); inProgressLatch.countDown(); }), stateInitProgress); assertTrue(inProgressLatch.await(100, TimeUnit.SECONDS)); @@ -593,7 +585,7 @@ public void testInitNoUpdateNoIndex() throws IOException, InterruptedException { .build(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -615,7 +607,7 @@ public void testInitNoIndex() throws IOException, InterruptedException { .build(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertEquals(expectedProfile, response); inProgressLatch.countDown(); }, exception -> { @@ -640,7 +632,7 @@ public void testFailRCFPolling() throws IOException, InterruptedException { setUpClientGet(DetectorStatus.EXIST, JobStatus.ENABLED, RCFPollingStatus.EXCEPTION, ErrorResultStatus.NO_ERROR); final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertTrue("Should not reach here ", false); inProgressLatch.countDown(); }, exception -> { diff --git 
a/src/test/java/org/opensearch/ad/AnomalyDetectorRestTestCase.java b/src/test/java/org/opensearch/ad/AnomalyDetectorRestTestCase.java index 5ca03fb4e..047ee5612 100644 --- a/src/test/java/org/opensearch/ad/AnomalyDetectorRestTestCase.java +++ b/src/test/java/org/opensearch/ad/AnomalyDetectorRestTestCase.java @@ -26,9 +26,6 @@ import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyDetectorExecutionInput; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Request; import org.opensearch.client.Response; import org.opensearch.client.RestClient; @@ -44,6 +41,10 @@ import org.opensearch.core.xcontent.XContentParser; import org.opensearch.core.xcontent.XContentParserUtils; import org.opensearch.test.rest.OpenSearchRestTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.RestHandlerUtils; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -122,9 +123,9 @@ protected AnomalyDetector createRandomAnomalyDetector( AnomalyDetector createdDetector = createAnomalyDetector(detector, refresh, client); if (withMetadata) { - return getAnomalyDetector(createdDetector.getDetectorId(), new BasicHeader(HttpHeaders.USER_AGENT, "Kibana"), client); + return getConfig(createdDetector.getId(), new BasicHeader(HttpHeaders.USER_AGENT, "Kibana"), client); } - return getAnomalyDetector(createdDetector.getDetectorId(), new BasicHeader(HttpHeaders.CONTENT_TYPE, "application/json"), client); + return getConfig(createdDetector.getId(), new BasicHeader(HttpHeaders.CONTENT_TYPE, "application/json"), client); } protected AnomalyDetector createAnomalyDetector(AnomalyDetector detector, Boolean refresh, RestClient client) throws IOException { @@ -141,7 +142,7 @@ protected AnomalyDetector createAnomalyDetector(AnomalyDetector detector, Boolea do { i++; try { - detectorInIndex = getAnomalyDetector(detectorId, client); + detectorInIndex = getConfig(detectorId, client); assertNotNull(detectorInIndex); break; } catch (Exception e) { @@ -164,7 +165,7 @@ protected AnomalyDetector createAnomalyDetector(AnomalyDetector detector, Boolea return detectorInIndex; } - protected Response startAnomalyDetector(String detectorId, DetectionDateRange dateRange, RestClient client) throws IOException { + protected Response startAnomalyDetector(String detectorId, DateRange dateRange, RestClient client) throws IOException { return TestHelpers .makeRequest( client, @@ -206,8 +207,8 @@ protected Response previewAnomalyDetector(String detectorId, RestClient client, ); } - public AnomalyDetector getAnomalyDetector(String detectorId, RestClient client) throws IOException { - return (AnomalyDetector) getAnomalyDetector(detectorId, false, client)[0]; + public AnomalyDetector getConfig(String detectorId, RestClient client) throws IOException { + return (AnomalyDetector) getConfig(detectorId, false, client)[0]; } public Response updateAnomalyDetector(String detectorId, AnomalyDetector newDetector, RestClient client) throws IOException { @@ -223,22 +224,17 @@ public Response updateAnomalyDetector(String detectorId, AnomalyDetector newDete ); } - public AnomalyDetector getAnomalyDetector(String detectorId, BasicHeader header, RestClient client) throws IOException { - return 
(AnomalyDetector) getAnomalyDetector(detectorId, header, false, false, client)[0]; + public AnomalyDetector getConfig(String detectorId, BasicHeader header, RestClient client) throws IOException { + return (AnomalyDetector) getConfig(detectorId, header, false, false, client)[0]; } - public ToXContentObject[] getAnomalyDetector(String detectorId, boolean returnJob, RestClient client) throws IOException { + public ToXContentObject[] getConfig(String detectorId, boolean returnJob, RestClient client) throws IOException { BasicHeader header = new BasicHeader(HttpHeaders.CONTENT_TYPE, "application/json"); - return getAnomalyDetector(detectorId, header, returnJob, false, client); + return getConfig(detectorId, header, returnJob, false, client); } - public ToXContentObject[] getAnomalyDetector( - String detectorId, - BasicHeader header, - boolean returnJob, - boolean returnTask, - RestClient client - ) throws IOException { + public ToXContentObject[] getConfig(String detectorId, BasicHeader header, boolean returnJob, boolean returnTask, RestClient client) + throws IOException { Response response = TestHelpers .makeRequest( client, @@ -256,7 +252,7 @@ public ToXContentObject[] getAnomalyDetector( String id = null; Long version = null; AnomalyDetector detector = null; - AnomalyDetectorJob detectorJob = null; + Job detectorJob = null; ADTask realtimeAdTask = null; ADTask historicalAdTask = null; while (parser.nextToken() != XContentParser.Token.END_OBJECT) { @@ -273,7 +269,7 @@ public ToXContentObject[] getAnomalyDetector( detector = AnomalyDetector.parse(parser); break; case "anomaly_detector_job": - detectorJob = AnomalyDetectorJob.parse(parser); + detectorJob = Job.parse(parser); break; case "realtime_detection_task": if (parser.currentToken() != XContentParser.Token.VALUE_NULL) { @@ -301,7 +297,7 @@ public ToXContentObject[] getAnomalyDetector( detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -309,7 +305,8 @@ public ToXContentObject[] getAnomalyDetector( detector.getLastUpdateTime(), null, detector.getUser(), - detector.getResultIndex() + detector.getCustomResultIndex(), + detector.getImputationOption() ), detectorJob, historicalAdTask, @@ -633,15 +630,16 @@ protected AnomalyDetector cloneDetector(AnomalyDetector anomalyDetector, String anomalyDetector.getIndices(), anomalyDetector.getFeatureAttributes(), anomalyDetector.getFilterQuery(), - anomalyDetector.getDetectionInterval(), + anomalyDetector.getInterval(), anomalyDetector.getWindowDelay(), anomalyDetector.getShingleSize(), anomalyDetector.getUiMetadata(), anomalyDetector.getSchemaVersion(), Instant.now(), - anomalyDetector.getCategoryField(), + anomalyDetector.getCategoryFields(), null, - resultIndex + resultIndex, + anomalyDetector.getImputationOption() ); return detector; } diff --git a/src/test/java/org/opensearch/ad/EntityProfileRunnerTests.java b/src/test/java/org/opensearch/ad/EntityProfileRunnerTests.java index ca11fd7c3..d94226baa 100644 --- a/src/test/java/org/opensearch/ad/EntityProfileRunnerTests.java +++ b/src/test/java/org/opensearch/ad/EntityProfileRunnerTests.java @@ -14,8 +14,6 @@ import static java.util.Collections.emptyMap; import static org.mockito.ArgumentMatchers.any; import static org.mockito.Mockito.*; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static 
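
The REST test case renames its accessors from getAnomalyDetector to getConfig and binds the job payload to the renamed Job model via Job.parse. The positional contract of the returned array is unchanged, as the return-statement hunk above shows: index 0 is the (rebuilt) detector, index 1 the job. A sketch of a caller under those assumptions (the wrapper class and method are hypothetical):

    import java.io.IOException;

    import org.opensearch.ad.model.AnomalyDetector;
    import org.opensearch.client.RestClient;
    import org.opensearch.core.xcontent.ToXContentObject;
    import org.opensearch.timeseries.model.Job;

    public abstract class GetConfigUsageSketch extends AnomalyDetectorRestTestCase {
        void fetchDetectorAndJob(String detectorId, RestClient client) throws IOException {
            // getConfig(...) replaces getAnomalyDetector(...); passing returnJob = true
            // makes index 1 carry the parsed Job document.
            ToXContentObject[] results = getConfig(detectorId, true, client);
            AnomalyDetector detector = (AnomalyDetector) results[0];
            Job job = (Job) results[1];
        }
    }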
org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import java.io.IOException; import java.time.temporal.ChronoUnit; @@ -33,24 +31,21 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchResponseSections; import org.opensearch.action.search.ShardSearchFailure; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.EntityProfile; import org.opensearch.ad.model.EntityProfileName; import org.opensearch.ad.model.EntityState; import org.opensearch.ad.model.InitProgressProfile; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.model.ModelProfile; import org.opensearch.ad.model.ModelProfileOnNode; import org.opensearch.ad.transport.EntityProfileAction; import org.opensearch.ad.transport.EntityProfileResponse; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; +import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; +import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.index.IndexNotFoundException; import org.opensearch.search.DocValueFormat; import org.opensearch.search.SearchHit; @@ -58,8 +53,18 @@ import org.opensearch.search.aggregations.InternalAggregations; import org.opensearch.search.aggregations.metrics.InternalMax; import org.opensearch.search.internal.InternalSearchResponse; - -public class EntityProfileRunnerTests extends AbstractADTest { +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.SecurityClientUtil; + +public class EntityProfileRunnerTests extends AbstractTimeSeriesTest { private AnomalyDetector detector; private int detectorIntervalMin; private Client client; @@ -71,7 +76,7 @@ public class EntityProfileRunnerTests extends AbstractADTest { private String detectorId; private String entityValue; private int requiredSamples; - private AnomalyDetectorJob job; + private Job job; private int smallUpdates; private String categoryField; @@ -128,10 +133,10 @@ public void setUp() throws Exception { when(client.threadPool()).thenReturn(threadPool); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, Settings.EMPTY); runner = new EntityProfileRunner(client, clientUtil, xContentRegistry(), requiredSamples); @@ -142,20 +147,17 @@ public void setUp() throws 
Exception { ActionListener listener = (ActionListener) args[1]; String indexName = request.index(); - if (indexName.equals(ANOMALY_DETECTORS_INDEX)) { - listener - .onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); - } else if (indexName.equals(ANOMALY_DETECTOR_JOB_INDEX)) { - listener - .onResponse( - TestHelpers.createGetResponse(job, detector.getDetectorId(), AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - ); + if (indexName.equals(CommonName.CONFIG_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); + } else if (indexName.equals(CommonName.JOB_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(job, detector.getId(), CommonName.JOB_INDEX)); } return null; }).when(client).get(any(), any()); entity = Entity.createSingleAttributeEntity(categoryField, entityValue); + modelId = entity.getModelId(detectorId).get(); } @SuppressWarnings("unchecked") @@ -166,7 +168,7 @@ private void setUpSearch() { SearchRequest request = (SearchRequest) args[0]; String indexName = request.indices()[0]; ActionListener listener = (ActionListener) args[1]; - if (indexName.equals(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + if (indexName.equals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { InternalMax maxAgg = new InternalMax(CommonName.AGG_NAME_MAX_TIME, latestSampleTimestamp, DocValueFormat.RAW, emptyMap()); InternalAggregations internalAggregations = InternalAggregations.from(Collections.singletonList(maxAgg)); @@ -226,7 +228,7 @@ private void setUpExecuteEntityProfileAction(InittedEverResultStatus initted) { String indexName = request.indices()[0]; ActionListener listener = (ActionListener) args[1]; SearchResponse searchResponse = null; - if (indexName.equals(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + if (indexName.equals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { InternalMax maxAgg = new InternalMax(CommonName.AGG_NAME_MAX_TIME, latestSampleTimestamp, DocValueFormat.RAW, emptyMap()); InternalAggregations internalAggregations = InternalAggregations.from(Collections.singletonList(maxAgg)); @@ -306,7 +308,7 @@ public void testEmptyProfile() throws InterruptedException { assertTrue("Should not reach here", false); inProgressLatch.countDown(); }, exception -> { - assertTrue(exception.getMessage().contains(CommonErrorMessages.EMPTY_PROFILES_COLLECT)); + assertTrue(exception.getMessage().contains(CommonMessages.EMPTY_PROFILES_COLLECT)); inProgressLatch.countDown(); })); assertTrue(inProgressLatch.await(100, TimeUnit.SECONDS)); @@ -315,6 +317,7 @@ public void testEmptyProfile() throws InterruptedException { public void testModel() throws InterruptedException { setUpExecuteEntityProfileAction(InittedEverResultStatus.INITTED); EntityProfile.Builder expectedProfile = new EntityProfile.Builder(); + ModelProfileOnNode modelProfile = new ModelProfileOnNode(nodeId, new ModelProfile(modelId, entity, modelSize)); expectedProfile.modelProfile(modelProfile); final CountDownLatch inProgressLatch = new CountDownLatch(1); @@ -328,6 +331,18 @@ public void testModel() throws InterruptedException { assertTrue(inProgressLatch.await(100, TimeUnit.SECONDS)); } + public void testEmptyModelProfile() throws IOException { + ModelProfile modelProfile = new ModelProfile(modelId, null, modelSize); + BytesStreamOutput output = new BytesStreamOutput(); + modelProfile.writeTo(output); + StreamInput streamInput = output.bytes().streamInput(); + ModelProfile readResponse = new ModelProfile(streamInput); + 
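
EntityProfileRunnerTests now derives the model id from the entity itself instead of hard-coding it: Entity.createSingleAttributeEntity builds the entity, and getModelId(detectorId) returns an Optional that the test unwraps directly. A sketch of that derivation (variable names follow the test; the wrapper class is hypothetical):

    import org.opensearch.timeseries.model.Entity;

    public class ModelIdSketch {
        static String modelIdFor(String categoryField, String entityValue, String detectorId) {
            Entity entity = Entity.createSingleAttributeEntity(categoryField, entityValue);
            // getModelId returns an Optional<String>; the test calls get() on it directly.
            return entity.getModelId(detectorId).get();
        }
    }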
assertEquals("serialization has the wrong model id", modelId, readResponse.getModelId()); + assertTrue("serialization has null entity", null == readResponse.getEntity()); + assertEquals("serialization has the wrong model size", modelSize, readResponse.getModelSizeInBytes()); + + } + @SuppressWarnings("unchecked") public void testJobIndexNotFound() throws InterruptedException { setUpExecuteEntityProfileAction(InittedEverResultStatus.INITTED); @@ -340,11 +355,10 @@ public void testJobIndexNotFound() throws InterruptedException { ActionListener listener = (ActionListener) args[1]; String indexName = request.index(); - if (indexName.equals(ANOMALY_DETECTORS_INDEX)) { - listener - .onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); - } else if (indexName.equals(ANOMALY_DETECTOR_JOB_INDEX)) { - listener.onFailure(new IndexNotFoundException(ANOMALY_DETECTOR_JOB_INDEX)); + if (indexName.equals(CommonName.CONFIG_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); + } else if (indexName.equals(CommonName.JOB_INDEX)) { + listener.onFailure(new IndexNotFoundException(CommonName.JOB_INDEX)); } return null; @@ -373,9 +387,8 @@ public void testNotMultiEntityDetector() throws IOException, InterruptedExceptio ActionListener listener = (ActionListener) args[1]; String indexName = request.index(); - if (indexName.equals(ANOMALY_DETECTORS_INDEX)) { - listener - .onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); + if (indexName.equals(CommonName.CONFIG_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); } return null; @@ -401,11 +414,7 @@ public void testInitNInfo() throws InterruptedException { // 1 / 128 rounded to 1% int neededSamples = requiredSamples - smallUpdates; - InitProgressProfile profile = new InitProgressProfile( - "1%", - neededSamples * detector.getDetectorIntervalInSeconds() / 60, - neededSamples - ); + InitProgressProfile profile = new InitProgressProfile("1%", neededSamples * detector.getIntervalInSeconds() / 60, neededSamples); expectedProfile.initProgress(profile); expectedProfile.isActive(isActive); expectedProfile.lastActiveTimestampMs(latestActiveTimestamp); diff --git a/src/test/java/org/opensearch/ad/HistoricalAnalysisIntegTestCase.java b/src/test/java/org/opensearch/ad/HistoricalAnalysisIntegTestCase.java index 058b3e6a1..9b8356081 100644 --- a/src/test/java/org/opensearch/ad/HistoricalAnalysisIntegTestCase.java +++ b/src/test/java/org/opensearch/ad/HistoricalAnalysisIntegTestCase.java @@ -15,9 +15,9 @@ import static org.opensearch.ad.model.ADTask.EXECUTION_START_TIME_FIELD; import static org.opensearch.ad.model.ADTask.IS_LATEST_FIELD; import static org.opensearch.ad.model.ADTask.PARENT_TASK_ID_FIELD; -import static org.opensearch.ad.util.RestHandlerUtils.START_JOB; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_PRIMARY_TERM; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO; +import static org.opensearch.timeseries.util.RestHandlerUtils.START_JOB; import java.io.IOException; import java.time.Instant; @@ -34,18 +34,13 @@ import org.opensearch.action.get.GetResponse; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import 
org.opensearch.ad.mock.plugin.MockReindexPlugin; import org.opensearch.ad.model.ADTask; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.Feature; import org.opensearch.ad.transport.AnomalyDetectorJobAction; import org.opensearch.ad.transport.AnomalyDetectorJobRequest; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.core.rest.RestStatus; import org.opensearch.index.query.BoolQueryBuilder; import org.opensearch.index.query.TermQueryBuilder; @@ -55,6 +50,13 @@ import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.search.sort.SortOrder; import org.opensearch.test.transport.MockTransportService; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.transport.JobResponse; import com.google.common.collect.ImmutableList; @@ -118,6 +120,7 @@ public void ingestTestData( } } + @Override public Feature maxValueFeature() throws IOException { AggregationBuilder aggregationBuilder = TestHelpers.parseAggregation("{\"test\":{\"max\":{\"field\":\"" + valueField + "\"}}}"); return new Feature(randomAlphaOfLength(5), randomAlphaOfLength(10), true, aggregationBuilder); @@ -127,27 +130,21 @@ public AnomalyDetector randomDetector(List features) throws IOException return TestHelpers.randomDetector(features, testIndex, detectionIntervalInMinutes, timeField); } - public ADTask randomCreatedADTask(String taskId, AnomalyDetector detector, DetectionDateRange detectionDateRange) { - String detectorId = detector == null ? null : detector.getDetectorId(); + public ADTask randomCreatedADTask(String taskId, AnomalyDetector detector, DateRange detectionDateRange) { + String detectorId = detector == null ? 
null : detector.getId(); return randomCreatedADTask(taskId, detector, detectorId, detectionDateRange); } - public ADTask randomCreatedADTask(String taskId, AnomalyDetector detector, String detectorId, DetectionDateRange detectionDateRange) { - return randomADTask(taskId, detector, detectorId, detectionDateRange, ADTaskState.CREATED); + public ADTask randomCreatedADTask(String taskId, AnomalyDetector detector, String detectorId, DateRange detectionDateRange) { + return randomADTask(taskId, detector, detectorId, detectionDateRange, TaskState.CREATED); } - public ADTask randomADTask( - String taskId, - AnomalyDetector detector, - String detectorId, - DetectionDateRange detectionDateRange, - ADTaskState state - ) { + public ADTask randomADTask(String taskId, AnomalyDetector detector, String detectorId, DateRange detectionDateRange, TaskState state) { ADTask.Builder builder = ADTask .builder() .taskId(taskId) .taskType(ADTaskType.HISTORICAL_SINGLE_ENTITY.name()) - .detectorId(detectorId) + .configId(detectorId) .detectionDateRange(detectionDateRange) .detector(detector) .state(state.name()) @@ -156,12 +153,12 @@ public ADTask randomADTask( .isLatest(true) .startedBy(randomAlphaOfLength(5)) .executionStartTime(Instant.now().minus(randomLongBetween(10, 100), ChronoUnit.MINUTES)); - if (ADTaskState.FINISHED == state) { + if (TaskState.FINISHED == state) { setPropertyForNotRunningTask(builder); - } else if (ADTaskState.FAILED == state) { + } else if (TaskState.FAILED == state) { setPropertyForNotRunningTask(builder); builder.error(randomAlphaOfLength(5)); - } else if (ADTaskState.STOPPED == state) { + } else if (TaskState.STOPPED == state) { setPropertyForNotRunningTask(builder); builder.error(randomAlphaOfLength(5)); builder.stoppedBy(randomAlphaOfLength(5)); @@ -191,7 +188,7 @@ public List searchADTasks(String detectorId, String parentTaskId, Boolea SearchRequest searchRequest = new SearchRequest(); SearchSourceBuilder sourceBuilder = new SearchSourceBuilder(); sourceBuilder.query(query).sort(EXECUTION_START_TIME_FIELD, SortOrder.DESC).trackTotalHits(true).size(size); - searchRequest.source(sourceBuilder).indices(CommonName.DETECTION_STATE_INDEX); + searchRequest.source(sourceBuilder).indices(ADCommonName.DETECTION_STATE_INDEX); SearchResponse searchResponse = client().search(searchRequest).actionGet(); Iterator iterator = searchResponse.getHits().iterator(); @@ -205,25 +202,25 @@ public List searchADTasks(String detectorId, String parentTaskId, Boolea } public ADTask getADTask(String taskId) throws IOException { - ADTask adTask = toADTask(getDoc(CommonName.DETECTION_STATE_INDEX, taskId)); + ADTask adTask = toADTask(getDoc(ADCommonName.DETECTION_STATE_INDEX, taskId)); adTask.setTaskId(taskId); return adTask; } - public AnomalyDetectorJob getADJob(String detectorId) throws IOException { - return toADJob(getDoc(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, detectorId)); + public Job getADJob(String detectorId) throws IOException { + return toADJob(getDoc(CommonName.JOB_INDEX, detectorId)); } public ADTask toADTask(GetResponse doc) throws IOException { return ADTask.parse(TestHelpers.parser(doc.getSourceAsString())); } - public AnomalyDetectorJob toADJob(GetResponse doc) throws IOException { - return AnomalyDetectorJob.parse(TestHelpers.parser(doc.getSourceAsString())); + public Job toADJob(GetResponse doc) throws IOException { + return Job.parse(TestHelpers.parser(doc.getSourceAsString())); } public ADTask startHistoricalAnalysis(Instant startTime, Instant endTime) throws IOException { - 
DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, endTime); AnomalyDetector detector = TestHelpers .randomDetector(ImmutableList.of(maxValueFeature()), testIndex, detectionIntervalInMinutes, timeField); String detectorId = createDetector(detector); @@ -235,12 +232,12 @@ public ADTask startHistoricalAnalysis(Instant startTime, Instant endTime) throws UNASSIGNED_PRIMARY_TERM, START_JOB ); - AnomalyDetectorJobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); + JobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); return getADTask(response.getId()); } public ADTask startHistoricalAnalysis(String detectorId, Instant startTime, Instant endTime) throws IOException { - DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, endTime); AnomalyDetectorJobRequest request = new AnomalyDetectorJobRequest( detectorId, dateRange, @@ -249,7 +246,7 @@ public ADTask startHistoricalAnalysis(String detectorId, Instant startTime, Inst UNASSIGNED_PRIMARY_TERM, START_JOB ); - AnomalyDetectorJobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); + JobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); return getADTask(response.getId()); } } diff --git a/src/test/java/org/opensearch/ad/HistoricalAnalysisRestTestCase.java b/src/test/java/org/opensearch/ad/HistoricalAnalysisRestTestCase.java index 4a4def114..09727aa7f 100644 --- a/src/test/java/org/opensearch/ad/HistoricalAnalysisRestTestCase.java +++ b/src/test/java/org/opensearch/ad/HistoricalAnalysisRestTestCase.java @@ -29,14 +29,15 @@ import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.ADTaskProfile; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.Feature; import org.opensearch.client.Response; import org.opensearch.client.RestClient; import org.opensearch.core.rest.RestStatus; import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.search.aggregations.AggregationBuilder; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Feature; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -59,7 +60,7 @@ public void setUp() throws Exception { public ToXContentObject[] getHistoricalAnomalyDetector(String detectorId, boolean returnTask, RestClient client) throws IOException { BasicHeader header = new BasicHeader(HttpHeaders.CONTENT_TYPE, "application/json"); - return getAnomalyDetector(detectorId, header, false, returnTask, client); + return getConfig(detectorId, header, false, returnTask, client); } public ADTaskProfile getADTaskProfile(String detectorId) throws IOException { @@ -216,7 +217,7 @@ protected AnomalyDetector createAnomalyDetector(int categoryFieldSize, String re protected String startHistoricalAnalysis(String detectorId) throws IOException { Instant endTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); Instant startTime = endTime.minus(10, ChronoUnit.DAYS).truncatedTo(ChronoUnit.SECONDS); - DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, 
endTime); Response startDetectorResponse = startAnomalyDetector(detectorId, dateRange, client()); Map startDetectorResponseMap = responseAsMap(startDetectorResponse); String taskId = (String) startDetectorResponseMap.get("_id"); diff --git a/src/test/java/org/opensearch/ad/MemoryTrackerTests.java b/src/test/java/org/opensearch/ad/MemoryTrackerTests.java index b4933d0f1..631240072 100644 --- a/src/test/java/org/opensearch/ad/MemoryTrackerTests.java +++ b/src/test/java/org/opensearch/ad/MemoryTrackerTests.java @@ -18,8 +18,6 @@ import java.util.Collections; import java.util.HashSet; -import org.opensearch.ad.breaker.ADCircuitBreakerService; -import org.opensearch.ad.common.exception.LimitExceededException; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.service.ClusterService; @@ -30,6 +28,10 @@ import org.opensearch.monitor.jvm.JvmInfo.Mem; import org.opensearch.monitor.jvm.JvmService; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.amazon.randomcutforest.config.Precision; import com.amazon.randomcutforest.config.TransformMethod; @@ -57,7 +59,7 @@ public class MemoryTrackerTests extends OpenSearchTestCase { double modelDesiredSizePercentage; JvmService jvmService; AnomalyDetector detector; - ADCircuitBreakerService circuitBreaker; + CircuitBreakerService circuitBreaker; @Override public void setUp() throws Exception { @@ -85,10 +87,10 @@ public void setUp() throws Exception { clusterService = mock(ClusterService.class); modelMaxPercen = 0.1f; - Settings settings = Settings.builder().put(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.getKey(), modelMaxPercen).build(); + Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.getKey(), modelMaxPercen).build(); ClusterSettings clusterSettings = new ClusterSettings( settings, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE))) ); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); @@ -106,7 +108,7 @@ public void setUp() throws Exception { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .shingleSize(shingleSize) .internalShinglingEnabled(true) .transformMethod(TransformMethod.NORMALIZE) @@ -118,20 +120,20 @@ public void setUp() throws Exception { when(detector.getEnabledFeatureIds()).thenReturn(Collections.singletonList("a")); when(detector.getShingleSize()).thenReturn(1); - circuitBreaker = mock(ADCircuitBreakerService.class); + circuitBreaker = mock(CircuitBreakerService.class); when(circuitBreaker.isOpen()).thenReturn(false); } private void setUpBigHeap() { ByteSizeValue value = new ByteSizeValue(largeHeapSize); when(mem.getHeapMax()).thenReturn(value); - tracker = new MemoryTracker(jvmService, modelMaxSizePercentage, modelDesiredSizePercentage, clusterService, circuitBreaker); + tracker = new MemoryTracker(jvmService, modelMaxSizePercentage, clusterService, 
circuitBreaker); } private void setUpSmallHeap() { ByteSizeValue value = new ByteSizeValue(smallHeapSize); when(mem.getHeapMax()).thenReturn(value); - tracker = new MemoryTracker(jvmService, modelMaxSizePercentage, modelDesiredSizePercentage, clusterService, circuitBreaker); + tracker = new MemoryTracker(jvmService, modelMaxSizePercentage, clusterService, circuitBreaker); } public void testEstimateModelSize() { @@ -151,7 +153,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(shingleSize) @@ -173,7 +175,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.BATCH_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.BATCH_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(false) // same with dimension for opportunistic memory saving .shingleSize(1) @@ -193,7 +195,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(1) @@ -213,7 +215,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(2) @@ -233,7 +235,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(4) @@ -253,7 +255,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(16) @@ -273,7 +275,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(32) @@ -293,7 +295,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + 
.boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(64) @@ -313,7 +315,7 @@ public void testEstimateModelSize() { .parallelExecutionEnabled(false) .compact(true) .precision(Precision.FLOAT_32) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) .internalShinglingEnabled(true) // same with dimension for opportunistic memory saving .shingleSize(65) @@ -331,10 +333,10 @@ public void testCanAllocate() { assertTrue(!tracker.canAllocate((long) (largeHeapSize * modelMaxPercen + 10))); long bytesToUse = 100_000; - tracker.consumeMemory(bytesToUse, false, MemoryTracker.Origin.HC_DETECTOR); + tracker.consumeMemory(bytesToUse, false, MemoryTracker.Origin.REAL_TIME_DETECTOR); assertTrue(!tracker.canAllocate((long) (largeHeapSize * modelMaxPercen))); - tracker.releaseMemory(bytesToUse, false, MemoryTracker.Origin.HC_DETECTOR); + tracker.releaseMemory(bytesToUse, false, MemoryTracker.Origin.REAL_TIME_DETECTOR); assertTrue(tracker.canAllocate((long) (largeHeapSize * modelMaxPercen))); } @@ -347,12 +349,11 @@ public void testMemoryToShed() { setUpSmallHeap(); long bytesToUse = 100_000; assertEquals(bytesToUse, tracker.getHeapLimit()); - assertEquals((long) (smallHeapSize * modelDesiredSizePercentage), tracker.getDesiredModelSize()); - tracker.consumeMemory(bytesToUse, false, MemoryTracker.Origin.HC_DETECTOR); - tracker.consumeMemory(bytesToUse, true, MemoryTracker.Origin.HC_DETECTOR); + tracker.consumeMemory(bytesToUse, false, MemoryTracker.Origin.REAL_TIME_DETECTOR); + tracker.consumeMemory(bytesToUse, true, MemoryTracker.Origin.REAL_TIME_DETECTOR); assertEquals(2 * bytesToUse, tracker.getTotalMemoryBytes()); assertEquals(bytesToUse, tracker.memoryToShed()); - assertTrue(!tracker.syncMemoryState(MemoryTracker.Origin.HC_DETECTOR, 2 * bytesToUse, bytesToUse)); + assertTrue(!tracker.syncMemoryState(MemoryTracker.Origin.REAL_TIME_DETECTOR, 2 * bytesToUse, bytesToUse)); } } diff --git a/src/test/java/org/opensearch/ad/MultiEntityProfileRunnerTests.java b/src/test/java/org/opensearch/ad/MultiEntityProfileRunnerTests.java index 1fc5b84d2..05a63e3df 100644 --- a/src/test/java/org/opensearch/ad/MultiEntityProfileRunnerTests.java +++ b/src/test/java/org/opensearch/ad/MultiEntityProfileRunnerTests.java @@ -17,8 +17,6 @@ import static org.mockito.ArgumentMatchers.anyBoolean; import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.mock; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import java.time.Clock; import java.time.Instant; @@ -44,10 +42,9 @@ import org.opensearch.action.get.GetResponse; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.model.DetectorInternalState; import org.opensearch.ad.model.DetectorProfile; @@ -65,9 +62,16 @@ import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import 
org.opensearch.core.common.transport.TransportAddress; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; -public class MultiEntityProfileRunnerTests extends AbstractADTest { +public class MultiEntityProfileRunnerTests extends AbstractTimeSeriesTest { private AnomalyDetectorProfileRunner runner; private Client client; private SecurityClientUtil clientUtil; @@ -90,7 +94,7 @@ public class MultiEntityProfileRunnerTests extends AbstractADTest { private String model0Id; private int shingleSize; - private AnomalyDetectorJob job; + private Job job; private TransportService transportService; private ADTaskManager adTaskManager; @@ -150,17 +154,12 @@ public void setUp() throws Exception { ActionListener listener = (ActionListener) args[1]; String indexName = request.index(); - if (indexName.equals(ANOMALY_DETECTORS_INDEX)) { - listener - .onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); - } else if (indexName.equals(CommonName.DETECTION_STATE_INDEX)) { - listener - .onResponse(TestHelpers.createGetResponse(result.build(), detector.getDetectorId(), CommonName.DETECTION_STATE_INDEX)); - } else if (indexName.equals(ANOMALY_DETECTOR_JOB_INDEX)) { - listener - .onResponse( - TestHelpers.createGetResponse(job, detector.getDetectorId(), AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - ); + if (indexName.equals(CommonName.CONFIG_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); + } else if (indexName.equals(ADCommonName.DETECTION_STATE_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(result.build(), detector.getId(), ADCommonName.DETECTION_STATE_INDEX)); + } else if (indexName.equals(CommonName.JOB_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(job, detector.getId(), CommonName.JOB_INDEX)); } return null; diff --git a/src/test/java/org/opensearch/ad/ODFERestTestCase.java b/src/test/java/org/opensearch/ad/ODFERestTestCase.java index 52be8a214..f57c64930 100644 --- a/src/test/java/org/opensearch/ad/ODFERestTestCase.java +++ b/src/test/java/org/opensearch/ad/ODFERestTestCase.java @@ -35,7 +35,6 @@ import org.apache.http.HttpHost; import org.apache.http.auth.AuthScope; import org.apache.http.auth.UsernamePasswordCredentials; -import org.apache.http.client.CredentialsProvider; import org.apache.http.conn.ssl.NoopHostnameVerifier; import org.apache.http.impl.client.BasicCredentialsProvider; import org.apache.http.message.BasicHeader; @@ -188,8 +187,9 @@ protected static void configureHttpsClient(RestClientBuilder builder, Settings s String password = Optional .ofNullable(System.getProperty("password")) .orElseThrow(() -> new RuntimeException("password is missing")); - CredentialsProvider credentialsProvider = new BasicCredentialsProvider(); - credentialsProvider.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials(userName, password)); + BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider(); + final AuthScope anyScope = new AuthScope(null, -1); + credentialsProvider.setCredentials(anyScope, new UsernamePasswordCredentials(userName, password)); try 
{ return httpClientBuilder .setDefaultCredentialsProvider(credentialsProvider) diff --git a/src/test/java/org/opensearch/ad/breaker/ADCircuitBreakerServiceTests.java b/src/test/java/org/opensearch/ad/breaker/ADCircuitBreakerServiceTests.java index 7a5be47b6..df2fa513d 100644 --- a/src/test/java/org/opensearch/ad/breaker/ADCircuitBreakerServiceTests.java +++ b/src/test/java/org/opensearch/ad/breaker/ADCircuitBreakerServiceTests.java @@ -25,11 +25,15 @@ import org.mockito.MockitoAnnotations; import org.opensearch.monitor.jvm.JvmService; import org.opensearch.monitor.jvm.JvmStats; +import org.opensearch.timeseries.breaker.BreakerName; +import org.opensearch.timeseries.breaker.CircuitBreaker; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.breaker.MemoryCircuitBreaker; public class ADCircuitBreakerServiceTests { @InjectMocks - private ADCircuitBreakerService adCircuitBreakerService; + private CircuitBreakerService adCircuitBreakerService; @Mock JvmService jvmService; diff --git a/src/test/java/org/opensearch/ad/breaker/MemoryCircuitBreakerTests.java b/src/test/java/org/opensearch/ad/breaker/MemoryCircuitBreakerTests.java index e9249df82..6264808cc 100644 --- a/src/test/java/org/opensearch/ad/breaker/MemoryCircuitBreakerTests.java +++ b/src/test/java/org/opensearch/ad/breaker/MemoryCircuitBreakerTests.java @@ -21,6 +21,8 @@ import org.mockito.MockitoAnnotations; import org.opensearch.monitor.jvm.JvmService; import org.opensearch.monitor.jvm.JvmStats; +import org.opensearch.timeseries.breaker.CircuitBreaker; +import org.opensearch.timeseries.breaker.MemoryCircuitBreaker; public class MemoryCircuitBreakerTests { diff --git a/src/test/java/org/opensearch/ad/bwc/ADBackwardsCompatibilityIT.java b/src/test/java/org/opensearch/ad/bwc/ADBackwardsCompatibilityIT.java index cfe2cfe9d..c60118b88 100644 --- a/src/test/java/org/opensearch/ad/bwc/ADBackwardsCompatibilityIT.java +++ b/src/test/java/org/opensearch/ad/bwc/ADBackwardsCompatibilityIT.java @@ -8,9 +8,6 @@ package org.opensearch.ad.bwc; -import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.MULTI_CATEGORY_HC_DETECTOR; -import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.SINGLE_CATEGORY_HC_DETECTOR; -import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.SINGLE_ENTITY_DETECTOR; import static org.opensearch.ad.rest.ADRestTestUtils.countADResultOfDetector; import static org.opensearch.ad.rest.ADRestTestUtils.countDetectors; import static org.opensearch.ad.rest.ADRestTestUtils.createAnomalyDetector; @@ -24,9 +21,12 @@ import static org.opensearch.ad.rest.ADRestTestUtils.stopHistoricalAnalysis; import static org.opensearch.ad.rest.ADRestTestUtils.stopRealtimeJob; import static org.opensearch.ad.rest.ADRestTestUtils.waitUntilTaskDone; -import static org.opensearch.ad.util.RestHandlerUtils.ANOMALY_DETECTOR_JOB; -import static org.opensearch.ad.util.RestHandlerUtils.HISTORICAL_ANALYSIS_TASK; -import static org.opensearch.ad.util.RestHandlerUtils.REALTIME_TASK; +import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.MULTI_CATEGORY_HC_DETECTOR; +import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.SINGLE_CATEGORY_HC_DETECTOR; +import static org.opensearch.ad.rest.ADRestTestUtils.DetectorType.SINGLE_ENTITY_DETECTOR; +import static org.opensearch.timeseries.util.RestHandlerUtils.ANOMALY_DETECTOR_JOB; +import static org.opensearch.timeseries.util.RestHandlerUtils.HISTORICAL_ANALYSIS_TASK; +import static 
org.opensearch.timeseries.util.RestHandlerUtils.REALTIME_TASK; import java.io.IOException; import java.util.ArrayList; @@ -39,19 +39,19 @@ import org.apache.http.HttpEntity; import org.junit.Assert; import org.junit.Before; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.rest.ADRestTestUtils; -import org.opensearch.ad.util.ExceptionUtil; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Response; import org.opensearch.common.settings.Settings; import org.opensearch.core.rest.RestStatus; import org.opensearch.test.rest.OpenSearchRestTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.ExceptionUtil; +import org.opensearch.timeseries.util.RestHandlerUtils; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -70,6 +70,7 @@ public class ADBackwardsCompatibilityIT extends OpenSearchRestTestCase { private List runningRealtimeDetectors; private List historicalDetectors; + @Override @Before public void setUp() throws Exception { super.setUp(); @@ -188,7 +189,7 @@ public void testBackwardsCompatibility() throws Exception { case UPGRADED: // This branch is for testing full upgraded cluster. That means all nodes in cluster are running // latest AD version. - Assert.assertTrue(pluginNames.contains("opensearch-anomaly-detection")); + Assert.assertTrue(pluginNames.contains("opensearch-time-series-analytics")); Assert.assertTrue(pluginNames.contains("opensearch-job-scheduler")); Map detectors = new HashMap<>(); @@ -258,7 +259,7 @@ private void verifyAdTasks() throws InterruptedException, IOException { i++; for (String detectorId : runningRealtimeDetectors) { Map jobAndTask = getDetectorWithJobAndTask(client(), detectorId); - AnomalyDetectorJob job = (AnomalyDetectorJob) jobAndTask.get(ANOMALY_DETECTOR_JOB); + Job job = (Job) jobAndTask.get(ANOMALY_DETECTOR_JOB); ADTask historicalTask = (ADTask) jobAndTask.get(HISTORICAL_ANALYSIS_TASK); ADTask realtimeTask = (ADTask) jobAndTask.get(REALTIME_TASK); assertTrue(job.isEnabled()); @@ -291,7 +292,7 @@ private void stopAndDeleteDetectors() throws Exception { } } Map jobAndTask = getDetectorWithJobAndTask(client(), detectorId); - AnomalyDetectorJob job = (AnomalyDetectorJob) jobAndTask.get(ANOMALY_DETECTOR_JOB); + Job job = (Job) jobAndTask.get(ANOMALY_DETECTOR_JOB); ADTask historicalAdTask = (ADTask) jobAndTask.get(HISTORICAL_ANALYSIS_TASK); if (!job.isEnabled() && historicalAdTask.isDone()) { Response deleteDetectorResponse = deleteDetector(client(), detectorId); @@ -320,7 +321,7 @@ private void startRealtimeJobForHistoricalDetectorOnNewNode() throws IOException String jobId = startAnomalyDetectorDirectly(client(), detectorId); assertEquals(detectorId, jobId); Map jobAndTask = getDetectorWithJobAndTask(client(), detectorId); - AnomalyDetectorJob detectorJob = (AnomalyDetectorJob) jobAndTask.get(ANOMALY_DETECTOR_JOB); + Job detectorJob = (Job) jobAndTask.get(ANOMALY_DETECTOR_JOB); assertTrue(detectorJob.isEnabled()); runningRealtimeDetectors.add(detectorId); } @@ -329,7 +330,7 @@ private void startRealtimeJobForHistoricalDetectorOnNewNode() throws IOException private void verifyAllRealtimeJobsRunning() throws IOException { for (String detectorId : 
runningRealtimeDetectors) { Map jobAndTask = getDetectorWithJobAndTask(client(), detectorId); - AnomalyDetectorJob detectorJob = (AnomalyDetectorJob) jobAndTask.get(ANOMALY_DETECTOR_JOB); + Job detectorJob = (Job) jobAndTask.get(ANOMALY_DETECTOR_JOB); assertTrue(detectorJob.isEnabled()); } } @@ -452,7 +453,7 @@ private List startAnomalyDetector(Response response, boolean historicalD if (!historicalDetector) { Map jobAndTask = getDetectorWithJobAndTask(client(), detectorId); - AnomalyDetectorJob job = (AnomalyDetectorJob) jobAndTask.get(ANOMALY_DETECTOR_JOB); + Job job = (Job) jobAndTask.get(ANOMALY_DETECTOR_JOB); assertTrue(job.isEnabled()); runningRealtimeDetectors.add(detectorId); } else { diff --git a/src/test/java/org/opensearch/ad/caching/AbstractCacheTest.java b/src/test/java/org/opensearch/ad/caching/AbstractCacheTest.java index 717d62379..2c990682a 100644 --- a/src/test/java/org/opensearch/ad/caching/AbstractCacheTest.java +++ b/src/test/java/org/opensearch/ad/caching/AbstractCacheTest.java @@ -22,18 +22,18 @@ import java.util.Random; import org.junit.Before; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.MemoryTracker; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager.ModelType; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.ratelimit.CheckpointMaintainWorker; import org.opensearch.ad.ratelimit.CheckpointWriteWorker; -import org.opensearch.ad.settings.AnomalyDetectorSettings; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; -public class AbstractCacheTest extends AbstractADTest { +public class AbstractCacheTest extends AbstractTimeSeriesTest { protected String modelId1, modelId2, modelId3, modelId4; protected Entity entity1, entity2, entity3, entity4; protected ModelState modelState1, modelState2, modelState3, modelState4; @@ -56,10 +56,10 @@ public void setUp() throws Exception { super.setUp(); detector = mock(AnomalyDetector.class); detectorId = "123"; - when(detector.getDetectorId()).thenReturn(detectorId); + when(detector.getId()).thenReturn(detectorId); detectorDuration = Duration.ofMinutes(5); - when(detector.getDetectionIntervalDuration()).thenReturn(detectorDuration); - when(detector.getDetectorIntervalInSeconds()).thenReturn(detectorDuration.getSeconds()); + when(detector.getIntervalDuration()).thenReturn(detectorDuration); + when(detector.getIntervalInSeconds()).thenReturn(detectorDuration.getSeconds()); when(detector.getEnabledFeatureIds()).thenReturn(new ArrayList() { { add("a"); @@ -94,7 +94,7 @@ public void setUp() throws Exception { memoryPerEntity, memoryTracker, clock, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, detectorId, checkpointWriteQueue, checkpointMaintainQueue, diff --git a/src/test/java/org/opensearch/ad/caching/CacheBufferTests.java b/src/test/java/org/opensearch/ad/caching/CacheBufferTests.java index 7332edf4b..265560ab5 100644 --- a/src/test/java/org/opensearch/ad/caching/CacheBufferTests.java +++ b/src/test/java/org/opensearch/ad/caching/CacheBufferTests.java @@ -22,8 +22,8 @@ import java.util.Optional; import org.mockito.ArgumentCaptor; -import org.opensearch.ad.MemoryTracker; import org.opensearch.ad.ratelimit.CheckpointMaintainRequest; +import org.opensearch.timeseries.MemoryTracker; 

 import test.org.opensearch.ad.util.MLUtil;
 import test.org.opensearch.ad.util.RandomModelStateConfig;
@@ -83,7 +83,7 @@ public void testRemovalCandidate2() throws InterruptedException {
         assertEquals(3 * memoryPerEntity, capturedMemoryReleased.stream().reduce(0L, (a, b) -> a + b).intValue());
         assertTrue(capturedreserved.get(0));
         assertTrue(!capturedreserved.get(1));
-        assertEquals(MemoryTracker.Origin.HC_DETECTOR, capturedOrigin.get(0));
+        assertEquals(MemoryTracker.Origin.REAL_TIME_DETECTOR, capturedOrigin.get(0));

         assertTrue(!cacheBuffer.expired(Duration.ofHours(1)));
     }
diff --git a/src/test/java/org/opensearch/ad/caching/PriorityCacheTests.java b/src/test/java/org/opensearch/ad/caching/PriorityCacheTests.java
index 333f5436d..4154687cf 100644
--- a/src/test/java/org/opensearch/ad/caching/PriorityCacheTests.java
+++ b/src/test/java/org/opensearch/ad/caching/PriorityCacheTests.java
@@ -44,19 +44,14 @@
 import org.apache.logging.log4j.Logger;
 import org.junit.Before;
 import org.mockito.ArgumentCaptor;
-import org.opensearch.ad.MemoryTracker;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.LimitExceededException;
 import org.opensearch.ad.ml.CheckpointDao;
 import org.opensearch.ad.ml.EntityModel;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.ml.ModelManager.ModelType;
 import org.opensearch.ad.ml.ModelState;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
+import org.opensearch.ad.settings.ADEnabledSetting;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.settings.EnabledSetting;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
 import org.opensearch.common.settings.Settings;
@@ -66,6 +61,12 @@
 import org.opensearch.monitor.jvm.JvmService;
 import org.opensearch.threadpool.Scheduler.ScheduledCancellable;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.MemoryTracker;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;

 public class PriorityCacheTests extends AbstractCacheTest {
     private static final Logger LOG = LogManager.getLogger(PriorityCacheTests.class);
@@ -98,11 +99,11 @@ public void setUp() throws Exception {
                 new HashSet<>(
                     Arrays
                         .asList(
-                            AnomalyDetectorSettings.DEDICATED_CACHE_SIZE,
-                            AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE,
-                            AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE,
-                            AnomalyDetectorSettings.CHECKPOINT_TTL,
-                            AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ
+                            AnomalyDetectorSettings.AD_DEDICATED_CACHE_SIZE,
+                            AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE,
+                            AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE,
+                            AnomalyDetectorSettings.AD_CHECKPOINT_TTL,
+                            AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ
                         )
                 )
             )
@@ -117,19 +118,19 @@ public void setUp() throws Exception {
         EntityCache cache = new PriorityCache(
             checkpoint,
             dedicatedCacheSize,
-            AnomalyDetectorSettings.CHECKPOINT_TTL,
+            AnomalyDetectorSettings.AD_CHECKPOINT_TTL,
             AnomalyDetectorSettings.MAX_INACTIVE_ENTITIES,
             memoryTracker,
-            AnomalyDetectorSettings.NUM_TREES,
+            TimeSeriesSettings.NUM_TREES,
             clock,
             clusterService,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
             threadPool,
             checkpointWriteQueue,
-            AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT,
+            TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT,
            checkpointMaintainQueue,
             Settings.EMPTY,
-            AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ
+            AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ
         );

         CacheProvider cacheProvider = new CacheProvider();
@@ -141,9 +142,9 @@ public void setUp() throws Exception {

         detector2 = mock(AnomalyDetector.class);
         detectorId2 = "456";
-        when(detector2.getDetectorId()).thenReturn(detectorId2);
-        when(detector2.getDetectionIntervalDuration()).thenReturn(detectorDuration);
-        when(detector2.getDetectorIntervalInSeconds()).thenReturn(detectorDuration.getSeconds());
+        when(detector2.getId()).thenReturn(detectorId2);
+        when(detector2.getIntervalDuration()).thenReturn(detectorDuration);
+        when(detector2.getIntervalInSeconds()).thenReturn(detectorDuration.getSeconds());

         point = new double[] { 0.1 };
     }
@@ -160,31 +161,32 @@ public void testCacheHit() {
         // ClusterService clusterService = mock(ClusterService.class);
         float modelMaxPercen = 0.1f;
-        // Settings settings = Settings.builder().put(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.getKey(), modelMaxPercen).build();
+        // Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.getKey(),
+        // modelMaxPercen).build();
         // ClusterSettings clusterSettings = new ClusterSettings(
         // settings,
-        // Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE)))
+        // Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE)))
         // );
         // when(clusterService.getClusterSettings()).thenReturn(clusterSettings);

-        memoryTracker = spy(new MemoryTracker(jvmService, modelMaxPercen, 0.002, clusterService, mock(ADCircuitBreakerService.class)));
+        memoryTracker = spy(new MemoryTracker(jvmService, modelMaxPercen, clusterService, mock(CircuitBreakerService.class)));

         EntityCache cache = new PriorityCache(
             checkpoint,
             dedicatedCacheSize,
-            AnomalyDetectorSettings.CHECKPOINT_TTL,
+            AnomalyDetectorSettings.AD_CHECKPOINT_TTL,
             AnomalyDetectorSettings.MAX_INACTIVE_ENTITIES,
             memoryTracker,
-            AnomalyDetectorSettings.NUM_TREES,
+            TimeSeriesSettings.NUM_TREES,
             clock,
             clusterService,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
             threadPool,
             checkpointWriteQueue,
-            AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT,
+            TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT,
             checkpointMaintainQueue,
             Settings.EMPTY,
-            AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ
+            AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ
         );

         CacheProvider cacheProvider = new CacheProvider();
@@ -199,7 +201,7 @@ public void testCacheHit() {
         assertEquals(1, entityCache.getTotalActiveEntities());
         assertEquals(1, entityCache.getAllModels().size());
         ModelState hitState = entityCache.get(modelState1.getModelId(), detector);
-        assertEquals(detectorId, hitState.getDetectorId());
+        assertEquals(detectorId, hitState.getId());
         EntityModel model = hitState.getModel();
         assertEquals(false, model.getTrcf().isPresent());
         assertTrue(model.getSamples().isEmpty());
@@ -215,7 +217,7 @@ public void testCacheHit() {
         verify(memoryTracker, times(1)).consumeMemory(memoryConsumed.capture(), reserved.capture(), origin.capture());
         assertEquals(dedicatedCacheSize * expectedMemoryPerEntity, memoryConsumed.getValue().intValue());
         assertEquals(true, reserved.getValue().booleanValue());
-        assertEquals(MemoryTracker.Origin.HC_DETECTOR, origin.getValue());
+        assertEquals(MemoryTracker.Origin.REAL_TIME_DETECTOR, origin.getValue());

         // for (int i = 0; i < 2; i++) {
         // cacheProvider.get(modelId2, detector);
@@ -468,7 +470,7 @@ public void testFailedConcurrentMaintenance() throws InterruptedException {
             new Thread(new FailedCleanRunnable(scheduledThreadCountDown)).start();

             entityCache.maintenance();
-        } catch (AnomalyDetectionException e) {
+        } catch (TimeSeriesException e) {
             scheduledThreadCountDown.countDown();
         }
@@ -652,9 +654,9 @@ public void testSelectEmpty() {
     // the next get method
     public void testLongDetectorInterval() {
         try {
-            EnabledSetting.getInstance().setSettingValue(EnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED, true);
+            ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED, true);
             when(clock.instant()).thenReturn(Instant.ofEpochSecond(1000));
-            when(detector.getDetectionIntervalDuration()).thenReturn(Duration.ofHours(12));
+            when(detector.getIntervalDuration()).thenReturn(Duration.ofHours(12));
             String modelId = entity1.getModelId(detectorId).get();
             // record last access time 1000
             assertTrue(null == entityCache.get(modelId, detector));
@@ -669,7 +671,7 @@ public void testLongDetectorInterval() {
             // * 1000 to convert to milliseconds
             assertEquals(currentTimeEpoch * 1000, entityCache.getLastActiveMs(detectorId, modelId));
         } finally {
-            EnabledSetting.getInstance().setSettingValue(EnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED, false);
+            ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED, false);
         }
     }
diff --git a/src/test/java/org/opensearch/ad/client/AnomalyDetectionNodeClientTests.java b/src/test/java/org/opensearch/ad/client/AnomalyDetectionNodeClientTests.java
index df845301f..614bf445a 100644
--- a/src/test/java/org/opensearch/ad/client/AnomalyDetectionNodeClientTests.java
+++ b/src/test/java/org/opensearch/ad/client/AnomalyDetectionNodeClientTests.java
@@ -12,8 +12,9 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
 import static org.opensearch.ad.model.AnomalyDetector.DETECTOR_TYPE_FIELD;
+import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG;

 import java.io.IOException;
 import java.time.Instant;
@@ -27,11 +28,9 @@
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.action.support.PlainActionFuture;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
 import org.opensearch.ad.model.AnomalyDetectorType;
 import org.opensearch.ad.model.DetectorProfile;
 import org.opensearch.ad.model.DetectorState;
@@ -46,6 +45,9 @@
 import org.opensearch.index.query.BoolQueryBuilder;
 import org.opensearch.index.query.TermQueryBuilder;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Job;

 import com.google.common.collect.ImmutableList;
@@ -68,7 +70,7 @@ public void setup() {

     @Test
     public void testSearchAnomalyDetectors_NoIndices() {
-        deleteIndexIfExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        deleteIndexIfExists(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);

         SearchResponse searchResponse = adClient.searchAnomalyDetectors(TestHelpers.matchAllRequest()).actionGet(10000);
         assertEquals(0, searchResponse.getInternalResponse().hits().getTotalHits().value);
@@ -76,7 +78,7 @@ public void testSearchAnomalyDetectors_NoIndices() {

     @Test
     public void testSearchAnomalyDetectors_Empty() throws IOException {
-        deleteIndexIfExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        deleteIndexIfExists(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
         createDetectorIndex();

         SearchResponse searchResponse = adClient.searchAnomalyDetectors(TestHelpers.matchAllRequest()).actionGet(10000);
@@ -143,9 +145,9 @@ public void testSearchAnomalyResults_Populated() throws IOException {

     @Test
     public void testGetDetectorProfile_NoIndices() throws ExecutionException, InterruptedException {
-        deleteIndexIfExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        deleteIndexIfExists(CommonName.CONFIG_INDEX);
         deleteIndexIfExists(ALL_AD_RESULTS_INDEX_PATTERN);
-        deleteIndexIfExists(CommonName.DETECTION_STATE_INDEX);
+        deleteIndexIfExists(ADCommonName.DETECTION_STATE_INDEX);

         GetAnomalyDetectorRequest profileRequest = new GetAnomalyDetectorRequest(
             "foo",
@@ -158,8 +160,12 @@ public void testGetDetectorProfile_NoIndices() throws ExecutionException, Interr
             null
         );

-        expectThrows(OpenSearchStatusException.class, () -> adClient.getDetectorProfile(profileRequest).actionGet(10000));
+        OpenSearchStatusException exception = expectThrows(
+            OpenSearchStatusException.class,
+            () -> adClient.getDetectorProfile(profileRequest).actionGet(10000)
+        );
+        assertTrue(exception.getMessage().contains(FAIL_TO_FIND_CONFIG_MSG));

         verify(clientSpy, times(1)).execute(any(GetAnomalyDetectorAction.class), any(), any());
     }
@@ -193,7 +199,7 @@ public void testGetDetectorProfile_Populated() throws IOException {
             9876,
             2345,
             detector,
-            mock(AnomalyDetectorJob.class),
+            mock(Job.class),
             false,
             mock(ADTask.class),
             mock(ADTask.class),
diff --git a/src/test/java/org/opensearch/ad/cluster/ADClusterEventListenerTests.java b/src/test/java/org/opensearch/ad/cluster/ADClusterEventListenerTests.java
index 6f30ed118..88546e5ce 100644
--- a/src/test/java/org/opensearch/ad/cluster/ADClusterEventListenerTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/ADClusterEventListenerTests.java
@@ -27,8 +27,7 @@
 import org.junit.Before;
 import org.junit.BeforeClass;
 import org.opensearch.Version;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.cluster.ClusterChangedEvent;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -38,8 +37,9 @@
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.gateway.GatewayService;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;

-public class ADClusterEventListenerTests extends AbstractADTest {
+public class ADClusterEventListenerTests extends AbstractTimeSeriesTest {
     private final String clusterManagerNodeId = "clusterManagerNode";
     private final String dataNode1Id = "dataNode1";
     private final String clusterName = "multi-node-cluster";
@@ -119,7 +119,7 @@ public void testUnchangedClusterState() {

     public void testIsWarmNode() {
         HashMap attributesForNode1 = new HashMap<>();
- attributesForNode1.put(CommonName.BOX_TYPE_KEY, CommonName.WARM_BOX_TYPE); + attributesForNode1.put(ADCommonName.BOX_TYPE_KEY, ADCommonName.WARM_BOX_TYPE); dataNode1 = new DiscoveryNode(dataNode1Id, buildNewFakeTransportAddress(), attributesForNode1, BUILT_IN_ROLES, Version.CURRENT); ClusterState warmNodeClusterState = ClusterState diff --git a/src/test/java/org/opensearch/ad/cluster/ADDataMigratorTests.java b/src/test/java/org/opensearch/ad/cluster/ADDataMigratorTests.java index 9b9bb93d9..66fbd3e78 100644 --- a/src/test/java/org/opensearch/ad/cluster/ADDataMigratorTests.java +++ b/src/test/java/org/opensearch/ad/cluster/ADDataMigratorTests.java @@ -21,8 +21,7 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.constant.CommonName.DETECTION_STATE_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; +import static org.opensearch.ad.constant.ADCommonName.DETECTION_STATE_INDEX; import org.apache.lucene.search.TotalHits; import org.junit.Before; @@ -34,8 +33,7 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.ShardSearchFailure; import org.opensearch.ad.ADUnitTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; @@ -48,12 +46,14 @@ import org.opensearch.search.SearchHits; import org.opensearch.search.aggregations.InternalAggregations; import org.opensearch.search.internal.InternalSearchResponse; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonName; public class ADDataMigratorTests extends ADUnitTestCase { private Client client; private ClusterService clusterService; private NamedXContentRegistry namedXContentRegistry; - private AnomalyDetectionIndices detectionIndices; + private ADIndexManagement detectionIndices; private ADDataMigrator adDataMigrator; private String detectorId; private String taskId; @@ -69,7 +69,7 @@ public void setUp() throws Exception { client = mock(Client.class); clusterService = mock(ClusterService.class); namedXContentRegistry = TestHelpers.xContentRegistry(); - detectionIndices = mock(AnomalyDetectionIndices.class); + detectionIndices = mock(ADIndexManagement.class); detectorId = randomAlphaOfLength(10); taskId = randomAlphaOfLength(10); detectorContent = "{\"_index\":\".opendistro-anomaly-detectors\",\"_type\":\"_doc\",\"_id\":\"" @@ -104,8 +104,8 @@ public void setUp() throws Exception { } public void testMigrateDataWithNullJobResponse() { - when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true); - when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true); + when(detectionIndices.doesJobIndexExist()).thenReturn(true); + when(detectionIndices.doesStateIndexExist()).thenReturn(true); doAnswer(invocation -> { ActionListener listener = invocation.getArgument(1); @@ -118,14 +118,14 @@ public void testMigrateDataWithNullJobResponse() { } public void testMigrateDataWithInitingDetectionStateIndexFailure() { - when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true); - when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(false); + when(detectionIndices.doesJobIndexExist()).thenReturn(true); + 
         when(detectionIndices.doesStateIndexExist()).thenReturn(false);
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(0);
             listener.onFailure(new RuntimeException("test"));
             return null;
-        }).when(detectionIndices).initDetectionStateIndex(any());
+        }).when(detectionIndices).initStateIndex(any());
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
@@ -138,14 +138,14 @@ public void testMigrateDataWithInitingDetectionStateIndexFailure() {
     }
 
     public void testMigrateDataWithInitingDetectionStateIndexAlreadyExists() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(false);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(false);
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(0);
             listener.onFailure(new ResourceAlreadyExistsException("test"));
             return null;
-        }).when(detectionIndices).initDetectionStateIndex(any());
+        }).when(detectionIndices).initStateIndex(any());
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
@@ -158,14 +158,14 @@ public void testMigrateDataWithInitingDetectionStateIndexAlreadyExists() {
     }
 
     public void testMigrateDataWithInitingDetectionStateIndexNotAcknowledged() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(false);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(false);
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(0);
             listener.onResponse(new CreateIndexResponse(false, false, DETECTION_STATE_INDEX));
             return null;
-        }).when(detectionIndices).initDetectionStateIndex(any());
+        }).when(detectionIndices).initStateIndex(any());
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
@@ -178,14 +178,14 @@ public void testMigrateDataWithInitingDetectionStateIndexNotAcknowledged() {
     }
 
     public void testMigrateDataWithInitingDetectionStateIndexAcknowledged() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(false);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(false);
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(0);
             listener.onResponse(new CreateIndexResponse(true, false, DETECTION_STATE_INDEX));
             return null;
-        }).when(detectionIndices).initDetectionStateIndex(any());
+        }).when(detectionIndices).initStateIndex(any());
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
@@ -198,8 +198,8 @@ public void testMigrateDataWithInitingDetectionStateIndexAcknowledged() {
     }
 
     public void testMigrateDataWithEmptyJobResponse() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
@@ -232,8 +232,8 @@ public void testMigrateDataWithEmptyJobResponse() {
     }
 
     public void testMigrateDataWithNormalJobResponseButMissingDetector() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         doAnswer(invocation -> {
             // Return correct AD job when search job index
@@ -282,8 +282,8 @@ public void testMigrateDataWithNormalJobResponseButMissingDetector() {
     }
 
     public void testMigrateDataWithNormalJobResponseAndExistingDetector() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         String detectorId = randomAlphaOfLength(10);
         doAnswer(invocation -> {
@@ -349,8 +349,8 @@ public void testMigrateDataWithNormalJobResponseAndExistingDetector() {
     }
 
     public void testMigrateDataWithNormalJobResponse_ExistingDetector_ExistingInternalError() {
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         String detectorId = randomAlphaOfLength(10);
         doAnswer(invocation -> {
@@ -420,7 +420,7 @@ public void testMigrateDataWithNormalJobResponse_ExistingDetector_ExistingIntern
     public void testMigrateDataTwice() {
         adDataMigrator.migrateData();
         adDataMigrator.migrateData();
-        verify(detectionIndices, times(1)).doesAnomalyDetectorJobIndexExist();
+        verify(detectionIndices, times(1)).doesJobIndexExist();
     }
 
     public void testMigrateDataWithNoAvailableShardsException() {
@@ -432,8 +432,8 @@ public void testMigrateDataWithNoAvailableShardsException() {
             );
             return null;
         }).when(client).search(any(), any());
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         adDataMigrator.migrateData();
         assertFalse(adDataMigrator.isMigrated());
@@ -442,11 +442,11 @@ public void testMigrateDataWithIndexNotFoundException() {
         doAnswer(invocation -> {
             ActionListener listener = invocation.getArgument(1);
-            listener.onFailure(new IndexNotFoundException(ANOMALY_DETECTOR_JOB_INDEX));
+            listener.onFailure(new IndexNotFoundException(CommonName.JOB_INDEX));
             return null;
         }).when(client).search(any(), any());
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         adDataMigrator.migrateData();
         verify(adDataMigrator, never()).backfillRealtimeTask(any(), anyBoolean());
@@ -459,8 +459,8 @@ public void testMigrateDataWithUnknownException() {
             listener.onFailure(new RuntimeException("test unknown exception"));
             return null;
         }).when(client).search(any(), any());
-        when(detectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true);
-        when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(true);
+        when(detectionIndices.doesJobIndexExist()).thenReturn(true);
+        when(detectionIndices.doesStateIndexExist()).thenReturn(true);
 
         adDataMigrator.migrateData();
         verify(adDataMigrator, never()).backfillRealtimeTask(any(), anyBoolean());
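Every test in ADDataMigratorTests above drives an asynchronous OpenSearch API through the same Mockito idiom: doAnswer pulls the ActionListener out of the stubbed invocation and completes it synchronously, so callback-driven code under test runs deterministically on the test thread. A minimal self-contained sketch of the pattern; the StateIndices interface here is a hypothetical stand-in for the detectionIndices mock above:

    import static org.mockito.ArgumentMatchers.any;
    import static org.mockito.Mockito.doAnswer;
    import static org.mockito.Mockito.mock;

    import org.opensearch.core.action.ActionListener;

    public class ListenerStubSketch {
        // Hypothetical async dependency; the real tests stub detectionIndices.initStateIndex.
        interface StateIndices {
            void initStateIndex(ActionListener<Boolean> listener);
        }

        static StateIndices failingStub() {
            StateIndices indices = mock(StateIndices.class);
            doAnswer(invocation -> {
                // getArgument(0) is the listener; complete it inline on the test thread.
                ActionListener<Boolean> listener = invocation.getArgument(0);
                listener.onFailure(new RuntimeException("test"));
                return null; // the stubbed method is void, so the Answer returns null
            }).when(indices).initStateIndex(any());
            return indices;
        }
    }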
diff --git a/src/test/java/org/opensearch/ad/cluster/ADVersionUtilTests.java b/src/test/java/org/opensearch/ad/cluster/ADVersionUtilTests.java
index 344d9e466..aa5fcc55b 100644
--- a/src/test/java/org/opensearch/ad/cluster/ADVersionUtilTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/ADVersionUtilTests.java
@@ -17,11 +17,11 @@ public class ADVersionUtilTests extends ADUnitTestCase {
 
     public void testParseVersionFromString() {
-        Version version = ADVersionUtil.fromString("1.1.0.0");
-        assertEquals(Version.V_1_1_0, version);
+        Version version = ADVersionUtil.fromString("2.1.0.0");
+        assertEquals(Version.V_2_1_0, version);
 
-        version = ADVersionUtil.fromString("1.1.0");
-        assertEquals(Version.V_1_1_0, version);
+        version = ADVersionUtil.fromString("2.1.0");
+        assertEquals(Version.V_2_1_0, version);
     }
 
     public void testParseVersionFromStringWithNull() {
@@ -31,9 +31,4 @@ public void testParseVersionFromStringWithNull() {
     public void testParseVersionFromStringWithWrongFormat() {
         expectThrows(IllegalArgumentException.class, () -> ADVersionUtil.fromString("1.1"));
     }
-
-    public void testCompatibleWithVersionOnOrAfter1_1() {
-        assertTrue(ADVersionUtil.compatibleWithVersionOnOrAfter1_1(Version.V_1_1_0));
-        assertFalse(ADVersionUtil.compatibleWithVersionOnOrAfter1_1(Version.V_1_0_0));
-    }
 }
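The updated assertions imply that ADVersionUtil.fromString accepts both the four-part plugin version string ("2.1.0.0") and the three-part core version ("2.1.0"), while rejecting two-part strings like "1.1". The patch does not show the implementation; a plausible sketch of the normalization it performs, with the caveat that the real utility may parse differently:

    // Illustrative only: the plugin's actual ADVersionUtil may differ.
    static String toCoreVersion(String pluginVersion) {
        String[] parts = pluginVersion.split("\\.");
        if (parts.length < 3) {
            // matches the testParseVersionFromStringWithWrongFormat expectation for "1.1"
            throw new IllegalArgumentException("Wrong version format: " + pluginVersion);
        }
        // "2.1.0.0" (plugin) -> "2.1.0" (core); "2.1.0" passes through unchanged.
        return parts[0] + "." + parts[1] + "." + parts[2];
    }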
diff --git a/src/test/java/org/opensearch/ad/cluster/ClusterManagerEventListenerTests.java b/src/test/java/org/opensearch/ad/cluster/ClusterManagerEventListenerTests.java
index 3f8fe1287..9c2e79236 100644
--- a/src/test/java/org/opensearch/ad/cluster/ClusterManagerEventListenerTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/ClusterManagerEventListenerTests.java
@@ -27,12 +27,9 @@
 import java.util.Locale;
 
 import org.junit.Before;
-import org.opensearch.ad.AbstractADTest;
 import org.opensearch.ad.cluster.diskcleanup.ModelCheckpointIndexRetention;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.ClientUtil;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.lifecycle.LifecycleListener;
@@ -41,8 +38,11 @@
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.threadpool.Scheduler.Cancellable;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.ClientUtil;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
 
-public class ClusterManagerEventListenerTests extends AbstractADTest {
+public class ClusterManagerEventListenerTests extends AbstractTimeSeriesTest {
     private ClusterService clusterService;
     private ThreadPool threadPool;
     private Client client;
@@ -60,7 +60,7 @@ public void setUp() throws Exception {
         clusterService = mock(ClusterService.class);
         ClusterSettings settings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.CHECKPOINT_TTL)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_CHECKPOINT_TTL)))
         );
         when(clusterService.getClusterSettings()).thenReturn(settings);
 
@@ -75,7 +75,7 @@ public void setUp() throws Exception {
         clock = mock(Clock.class);
         clientUtil = mock(ClientUtil.class);
         HashMap ignoredAttributes = new HashMap();
-        ignoredAttributes.put(CommonName.BOX_TYPE_KEY, CommonName.WARM_BOX_TYPE);
+        ignoredAttributes.put(ADCommonName.BOX_TYPE_KEY, ADCommonName.WARM_BOX_TYPE);
         nodeFilter = new DiscoveryNodeFilterer(clusterService);
 
         clusterManagerService = new ClusterManagerEventListener(
@@ -85,7 +85,7 @@ public void setUp() throws Exception {
             clock,
             clientUtil,
             nodeFilter,
-            AnomalyDetectorSettings.CHECKPOINT_TTL,
+            AnomalyDetectorSettings.AD_CHECKPOINT_TTL,
             Settings.EMPTY
         );
     }
diff --git a/src/test/java/org/opensearch/ad/cluster/DailyCronTests.java b/src/test/java/org/opensearch/ad/cluster/DailyCronTests.java
index 0ec57b77f..a57a4c649 100644
--- a/src/test/java/org/opensearch/ad/cluster/DailyCronTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/DailyCronTests.java
@@ -23,14 +23,14 @@
 import java.util.Locale;
 
 import org.opensearch.OpenSearchException;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.util.ClientUtil;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.index.IndexNotFoundException;
 import org.opensearch.index.reindex.BulkByScrollResponse;
 import org.opensearch.index.reindex.DeleteByQueryAction;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.ClientUtil;
 
-public class DailyCronTests extends AbstractADTest {
+public class DailyCronTests extends AbstractTimeSeriesTest {
 
     enum DailyCronTestExecutionMode {
         NORMAL,
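A setup detail worth noting in ClusterManagerEventListenerTests: any Setting a component reads dynamically must be registered in the ClusterSettings handed to the mocked ClusterService, otherwise registering an update consumer fails at construction time. That is why the rename to AD_CHECKPOINT_TTL has to be mirrored in the test's ClusterSettings set. Boiled down to its essentials; the setting key below is hypothetical:

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.HashSet;

    import org.opensearch.common.settings.ClusterSettings;
    import org.opensearch.common.settings.Setting;
    import org.opensearch.common.settings.Settings;

    public class SettingsRegistrationSketch {
        // Hypothetical dynamic setting, standing in for AnomalyDetectorSettings.AD_CHECKPOINT_TTL.
        static final Setting<Integer> MY_TTL_DAYS = Setting
            .intSetting("plugins.example.ttl_days", 3, Setting.Property.NodeScope, Setting.Property.Dynamic);

        static ClusterSettings registered() {
            // Components calling clusterSettings.addSettingsUpdateConsumer(MY_TTL_DAYS, ...)
            // only work if the setting is part of this registered set.
            return new ClusterSettings(Settings.EMPTY, Collections.unmodifiableSet(new HashSet<>(Arrays.asList(MY_TTL_DAYS))));
        }
    }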
diff --git a/src/test/java/org/opensearch/ad/cluster/HashRingTests.java b/src/test/java/org/opensearch/ad/cluster/HashRingTests.java
index 3f3949558..69bb38a57 100644
--- a/src/test/java/org/opensearch/ad/cluster/HashRingTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/HashRingTests.java
@@ -19,7 +19,7 @@
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.spy;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.COOLDOWN_MINUTES;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_COOLDOWN_MINUTES;
 
 import java.net.UnknownHostException;
 import java.time.Clock;
@@ -36,9 +36,8 @@
 import org.opensearch.action.admin.cluster.node.info.NodesInfoResponse;
 import org.opensearch.action.admin.cluster.node.info.PluginsAndModules;
 import org.opensearch.ad.ADUnitTestCase;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.ml.ModelManager;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.AdminClient;
 import org.opensearch.client.Client;
 import org.opensearch.client.ClusterAdminClient;
@@ -51,6 +50,8 @@
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.plugins.PluginInfo;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
 
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
@@ -85,10 +86,10 @@ public void setUp() throws Exception {
         newNodeId = "newNode";
         newNode = createNode(newNodeId, "127.0.0.2", 9201, emptyMap());
         warmNodeId = "warmNode";
-        warmNode = createNode(warmNodeId, "127.0.0.3", 9202, ImmutableMap.of(CommonName.BOX_TYPE_KEY, CommonName.WARM_BOX_TYPE));
+        warmNode = createNode(warmNodeId, "127.0.0.3", 9202, ImmutableMap.of(ADCommonName.BOX_TYPE_KEY, ADCommonName.WARM_BOX_TYPE));
 
-        settings = Settings.builder().put(COOLDOWN_MINUTES.getKey(), TimeValue.timeValueSeconds(5)).build();
-        ClusterSettings clusterSettings = clusterSetting(settings, COOLDOWN_MINUTES);
+        settings = Settings.builder().put(AD_COOLDOWN_MINUTES.getKey(), TimeValue.timeValueSeconds(5)).build();
+        ClusterSettings clusterSettings = clusterSetting(settings, AD_COOLDOWN_MINUTES);
         clusterService = spy(new ClusterService(settings, clusterSettings, null));
 
         nodeFilter = spy(new DiscoveryNodeFilterer(clusterService));
@@ -142,10 +143,10 @@ public void testGetOwningNode() throws UnknownHostException {
             assertEquals(
                 "Wrong hash ring size for historical analysis",
                 2,
-                hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, false).size()
+                hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, false).size()
             );
             // Circles for realtime AD will change as it's eligible to build for when its empty
-            assertEquals("Wrong hash ring size for realtime AD", 2, hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, true).size());
+            assertEquals("Wrong hash ring size for realtime AD", 2, hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, true).size());
         }, e -> {
             logger.error("building hash ring failed", e);
             assertFalse("Build hash ring failed", true);
@@ -161,10 +162,10 @@ public void testGetOwningNode() throws UnknownHostException {
             assertEquals(
                 "Wrong hash ring size for historical analysis",
                 3,
-                hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, false).size()
+                hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, false).size()
             );
             // Circles for realtime AD will not change as it's eligible to rebuild
-            assertEquals("Wrong hash ring size for realtime AD", 2, hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, true).size());
+            assertEquals("Wrong hash ring size for realtime AD", 2, hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, true).size());
         }, e -> {
             logger.error("building hash ring failed", e);
@@ -182,9 +183,9 @@ public void testGetOwningNode() throws UnknownHostException {
             assertEquals(
                 "Wrong hash ring size for historical analysis",
                 4,
-                hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, false).size()
+                hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, false).size()
             );
-            assertEquals("Wrong hash ring size for realtime AD", 4, hashRing.getNodesWithSameAdVersion(Version.V_1_1_0, true).size());
+            assertEquals("Wrong hash ring size for realtime AD", 4, hashRing.getNodesWithSameAdVersion(Version.V_2_1_0, true).size());
         }, e -> {
             logger.error("building hash ring failed", e);
             assertFalse("Failed to build hash ring", true);
@@ -237,7 +238,7 @@ private void setupClusterAdminClient(DiscoveryNode... nodes) {
             ActionListener listener = invocation.getArgument(1);
             List nodeInfos = new ArrayList<>();
             for (DiscoveryNode node : nodes) {
-                nodeInfos.add(createNodeInfo(node, "1.1.0.0"));
+                nodeInfos.add(createNodeInfo(node, "2.1.0.0"));
             }
             NodesInfoResponse nodesInfoResponse = new NodesInfoResponse(ClusterName.DEFAULT, nodeInfos, ImmutableList.of());
             listener.onResponse(nodesInfoResponse);
@@ -250,7 +251,7 @@ private NodeInfo createNodeInfo(DiscoveryNode node, String version) {
         plugins
             .add(
                 new PluginInfo(
-                    CommonName.AD_PLUGIN_NAME,
+                    CommonName.TIME_SERIES_PLUGIN_NAME,
                    randomAlphaOfLengthBetween(3, 10),
                     version,
                     Version.CURRENT,
diff --git a/src/test/java/org/opensearch/ad/cluster/HourlyCronTests.java b/src/test/java/org/opensearch/ad/cluster/HourlyCronTests.java
index b3fedf5dd..2806138d9 100644
--- a/src/test/java/org/opensearch/ad/cluster/HourlyCronTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/HourlyCronTests.java
@@ -27,12 +27,10 @@
 import org.opensearch.OpenSearchException;
 import org.opensearch.Version;
 import org.opensearch.action.FailedNodeException;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.transport.CronAction;
 import org.opensearch.ad.transport.CronNodeResponse;
 import org.opensearch.ad.transport.CronResponse;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -40,10 +38,12 @@
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.common.io.stream.StreamInput;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
 
 import test.org.opensearch.ad.util.ClusterCreation;
 
-public class HourlyCronTests extends AbstractADTest {
+public class HourlyCronTests extends AbstractTimeSeriesTest {
 
     enum HourlyCronTestExecutionMode {
         NORMAL,
@@ -59,7 +59,7 @@ public void templateHourlyCron(HourlyCronTestExecutionMode mode) {
         ClusterState state = ClusterCreation.state(1);
         when(clusterService.state()).thenReturn(state);
         HashMap ignoredAttributes = new HashMap();
-        ignoredAttributes.put(CommonName.BOX_TYPE_KEY, CommonName.WARM_BOX_TYPE);
+        ignoredAttributes.put(ADCommonName.BOX_TYPE_KEY, ADCommonName.WARM_BOX_TYPE);
         DiscoveryNodeFilterer nodeFilter = new DiscoveryNodeFilterer(clusterService);
 
         Client client = mock(Client.class);
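The HashRingTests assertions center on getNodesWithSameAdVersion: the ring keeps a separate circle of nodes per plugin version, so model ownership only moves between nodes running the same AD version, and the realtime circle is rebuilt lazily while the historical one is built on demand. A toy version-bucketed consistent-hash ring, unrelated to the plugin's internal classes, to illustrate the idea:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.TreeMap;

    public class VersionedRingSketch {
        // version string -> circle of (hash -> nodeId); illustrative, not the plugin's HashRing.
        private final Map<String, TreeMap<Integer, String>> circles = new HashMap<>();

        void addNode(String version, String nodeId) {
            circles.computeIfAbsent(version, v -> new TreeMap<>()).put(nodeId.hashCode(), nodeId);
        }

        // Owning node: first node clockwise from the model id's hash, within the same-version circle.
        String getOwningNode(String version, String modelId) {
            TreeMap<Integer, String> circle = circles.get(version);
            if (circle == null || circle.isEmpty()) {
                return null;
            }
            Map.Entry<Integer, String> entry = circle.ceilingEntry(modelId.hashCode());
            return entry != null ? entry.getValue() : circle.firstEntry().getValue();
        }
    }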
diff --git a/src/test/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanupTests.java b/src/test/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanupTests.java
index 656881eeb..0748fe122 100644
--- a/src/test/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanupTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/diskcleanup/IndexCleanupTests.java
@@ -28,8 +28,6 @@
 import org.opensearch.action.admin.indices.stats.CommonStats;
 import org.opensearch.action.admin.indices.stats.IndicesStatsResponse;
 import org.opensearch.action.admin.indices.stats.ShardStats;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.util.ClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.client.IndicesAdminClient;
 import org.opensearch.cluster.service.ClusterService;
@@ -38,8 +36,10 @@
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.index.reindex.DeleteByQueryAction;
 import org.opensearch.index.store.StoreStats;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.ClientUtil;
 
-public class IndexCleanupTests extends AbstractADTest {
+public class IndexCleanupTests extends AbstractTimeSeriesTest {
 
     @Mock(answer = Answers.RETURNS_DEEP_STUBS)
     Client client;
diff --git a/src/test/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetentionTests.java b/src/test/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetentionTests.java
index 13c4a7d90..b95757925 100644
--- a/src/test/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetentionTests.java
+++ b/src/test/java/org/opensearch/ad/cluster/diskcleanup/ModelCheckpointIndexRetentionTests.java
@@ -26,11 +26,11 @@
 import org.junit.Test;
 import org.mockito.Mock;
 import org.mockito.MockitoAnnotations;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.core.action.ActionListener;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
 
-public class ModelCheckpointIndexRetentionTests extends AbstractADTest {
+public class ModelCheckpointIndexRetentionTests extends AbstractTimeSeriesTest {
 
     Duration defaultCheckpointTtl = Duration.ofDays(3);
 
@@ -70,12 +70,14 @@ public void testRunWithCleanupAsNeeded() throws Exception {
             ActionListener listener = (ActionListener) args[3];
             listener.onResponse(true);
             return null;
-        }).when(indexCleanup).deleteDocsBasedOnShardSize(eq(CommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
+        })
+            .when(indexCleanup)
+            .deleteDocsBasedOnShardSize(eq(ADCommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
 
         modelCheckpointIndexRetention.run();
         verify(indexCleanup, times(2))
-            .deleteDocsBasedOnShardSize(eq(CommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
-        verify(indexCleanup).deleteDocsByQuery(eq(CommonName.CHECKPOINT_INDEX_NAME), any(), any());
+            .deleteDocsBasedOnShardSize(eq(ADCommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
+        verify(indexCleanup).deleteDocsByQuery(eq(ADCommonName.CHECKPOINT_INDEX_NAME), any(), any());
     }
 
     @SuppressWarnings("unchecked")
@@ -86,10 +88,12 @@ public void testRunWithCleanupAsFalse() throws Exception {
             ActionListener listener = (ActionListener) args[3];
             listener.onResponse(false);
             return null;
-        }).when(indexCleanup).deleteDocsBasedOnShardSize(eq(CommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
+        })
+            .when(indexCleanup)
+            .deleteDocsBasedOnShardSize(eq(ADCommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
 
         modelCheckpointIndexRetention.run();
-        verify(indexCleanup).deleteDocsBasedOnShardSize(eq(CommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
-        verify(indexCleanup).deleteDocsByQuery(eq(CommonName.CHECKPOINT_INDEX_NAME), any(), any());
+        verify(indexCleanup).deleteDocsBasedOnShardSize(eq(ADCommonName.CHECKPOINT_INDEX_NAME), eq(50 * 1024 * 1024 * 1024L), any(), any());
+        verify(indexCleanup).deleteDocsByQuery(eq(ADCommonName.CHECKPOINT_INDEX_NAME), any(), any());
    }
 }
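One easily missed detail in the retention tests: the shard-size threshold 50 * 1024 * 1024 * 1024L works only because the final factor carries the L suffix. Multiplication associates left to right, so the first two products stay within int range (52,428,800) and the last multiply widens to long before it can overflow, producing 53,687,091,200 bytes (50 GB). Without the suffix the int product overflows to Integer.MIN_VALUE:

    public class LongLiteralSketch {
        public static void main(String[] args) {
            long withSuffix = 50 * 1024 * 1024 * 1024L;   // last multiply is done in long arithmetic
            long withoutSuffix = 50 * 1024 * 1024 * 1024; // int overflow happens first, then widening
            System.out.println(withSuffix);    // 53687091200
            System.out.println(withoutSuffix); // -2147483648
        }
    }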
diff --git a/src/test/java/org/opensearch/ad/common/exception/ADTaskCancelledExceptionTests.java b/src/test/java/org/opensearch/ad/common/exception/ADTaskCancelledExceptionTests.java
index 8c7e0cdaa..d66573379 100644
--- a/src/test/java/org/opensearch/ad/common/exception/ADTaskCancelledExceptionTests.java
+++ b/src/test/java/org/opensearch/ad/common/exception/ADTaskCancelledExceptionTests.java
@@ -12,13 +12,14 @@
 package org.opensearch.ad.common.exception;
 
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.common.exception.TaskCancelledException;
 
 public class ADTaskCancelledExceptionTests extends OpenSearchTestCase {
 
     public void testConstructor() {
         String message = randomAlphaOfLength(5);
         String user = randomAlphaOfLength(5);
-        ADTaskCancelledException exception = new ADTaskCancelledException(message, user);
+        TaskCancelledException exception = new TaskCancelledException(message, user);
         assertEquals(message, exception.getMessage());
         assertEquals(user, exception.getCancelledBy());
     }
diff --git a/src/test/java/org/opensearch/ad/common/exception/ADValidationExceptionTests.java b/src/test/java/org/opensearch/ad/common/exception/ADValidationExceptionTests.java
deleted file mode 100644
index f5e3bb030..000000000
--- a/src/test/java/org/opensearch/ad/common/exception/ADValidationExceptionTests.java
+++ /dev/null
@@ -1,43 +0,0 @@
-/*
- * SPDX-License-Identifier: Apache-2.0
- *
- * The OpenSearch Contributors require contributions made to
- * this file be licensed under the Apache-2.0 license or a
- * compatible open source license.
- *
- * Modifications Copyright OpenSearch Contributors. See
- * GitHub history for details.
- */
-
-package org.opensearch.ad.common.exception;
-
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.model.DetectorValidationIssueType;
-import org.opensearch.ad.model.ValidationAspect;
-import org.opensearch.test.OpenSearchTestCase;
-
-public class ADValidationExceptionTests extends OpenSearchTestCase {
-    public void testConstructorDetector() {
-        String message = randomAlphaOfLength(5);
-        ADValidationException exception = new ADValidationException(message, DetectorValidationIssueType.NAME, ValidationAspect.DETECTOR);
-        assertEquals(DetectorValidationIssueType.NAME, exception.getType());
-        assertEquals(ValidationAspect.DETECTOR, exception.getAspect());
-    }
-
-    public void testConstructorModel() {
-        String message = randomAlphaOfLength(5);
-        ADValidationException exception = new ADValidationException(message, DetectorValidationIssueType.CATEGORY, ValidationAspect.MODEL);
-        assertEquals(DetectorValidationIssueType.CATEGORY, exception.getType());
-        assertEquals(ValidationAspect.getName(CommonName.MODEL_ASPECT), exception.getAspect());
-    }
-
-    public void testToString() {
-        String message = randomAlphaOfLength(5);
-        ADValidationException exception = new ADValidationException(message, DetectorValidationIssueType.NAME, ValidationAspect.DETECTOR);
-        String exceptionString = exception.toString();
-        logger.info("exception string: " + exceptionString);
-        ADValidationException exceptionNoType = new ADValidationException(message, DetectorValidationIssueType.NAME, null);
-        String exceptionStringNoType = exceptionNoType.toString();
-        logger.info("exception string no type: " + exceptionStringNoType);
-    }
-}
diff --git a/src/test/java/org/opensearch/ad/common/exception/LimitExceededExceptionTests.java b/src/test/java/org/opensearch/ad/common/exception/LimitExceededExceptionTests.java
index 697a19cb8..37b3770ff 100644
--- a/src/test/java/org/opensearch/ad/common/exception/LimitExceededExceptionTests.java
+++ b/src/test/java/org/opensearch/ad/common/exception/LimitExceededExceptionTests.java
@@ -14,6 +14,7 @@
 import static org.junit.Assert.assertEquals;
 
 import org.junit.Test;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
 
 public class LimitExceededExceptionTests {
 
@@ -22,7 +23,7 @@ public void testConstructorWithIdAndExplanation() {
         String id = "test id";
         String message = "test message";
         LimitExceededException limitExceeded = new LimitExceededException(id, message);
-        assertEquals(id, limitExceeded.getAnomalyDetectorId());
+        assertEquals(id, limitExceeded.getConfigId());
         assertEquals(message, limitExceeded.getMessage());
     }
 }
diff --git a/src/test/java/org/opensearch/ad/common/exception/NotSerializedADExceptionNameTests.java b/src/test/java/org/opensearch/ad/common/exception/NotSerializedADExceptionNameTests.java
index 10ddfc64c..0e544b409 100644
--- a/src/test/java/org/opensearch/ad/common/exception/NotSerializedADExceptionNameTests.java
+++ b/src/test/java/org/opensearch/ad/common/exception/NotSerializedADExceptionNameTests.java
@@ -15,53 +15,60 @@
 
 import org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.common.exception.ClientException;
+import org.opensearch.timeseries.common.exception.DuplicateTaskException;
+import org.opensearch.timeseries.common.exception.InternalFailure;
+import org.opensearch.timeseries.common.exception.NotSerializedExceptionName;
+import org.opensearch.timeseries.common.exception.TaskCancelledException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.common.exception.ValidationException;
 
 public class NotSerializedADExceptionNameTests extends OpenSearchTestCase {
 
     public void testConvertAnomalyDetectionException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new AnomalyDetectionException("", "")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new TimeSeriesException("", "")), "");
         assertTrue(converted.isPresent());
-        assertTrue(converted.get() instanceof AnomalyDetectionException);
+        assertTrue(converted.get() instanceof TimeSeriesException);
     }
 
     public void testConvertInternalFailure() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new InternalFailure("", "")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new InternalFailure("", "")), "");
         assertTrue(converted.isPresent());
         assertTrue(converted.get() instanceof InternalFailure);
     }
 
     public void testConvertClientException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new ClientException("", "")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new ClientException("", "")), "");
         assertTrue(converted.isPresent());
         assertTrue(converted.get() instanceof ClientException);
     }
 
     public void testConvertADTaskCancelledException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new ADTaskCancelledException("", "")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new TaskCancelledException("", "")), "");
         assertTrue(converted.isPresent());
-        assertTrue(converted.get() instanceof ADTaskCancelledException);
+        assertTrue(converted.get() instanceof TaskCancelledException);
     }
 
     public void testConvertDuplicateTaskException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new DuplicateTaskException("")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new DuplicateTaskException("")), "");
         assertTrue(converted.isPresent());
         assertTrue(converted.get() instanceof DuplicateTaskException);
     }
 
     public void testConvertADValidationException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new ADValidationException("", null, null)), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new ValidationException("", null, null)), "");
         assertTrue(converted.isPresent());
-        assertTrue(converted.get() instanceof ADValidationException);
+        assertTrue(converted.get() instanceof ValidationException);
     }
 
     public void testUnknownException() {
-        Optional converted = NotSerializedADExceptionName
-            .convertWrappedAnomalyDetectionException(new NotSerializableExceptionWrapper(new RuntimeException("")), "");
+        Optional converted = NotSerializedExceptionName
+            .convertWrappedTimeSeriesException(new NotSerializableExceptionWrapper(new RuntimeException("")), "");
         assertTrue(!converted.isPresent());
     }
 }
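These conversion tests pin down the transport-layer quirk behind NotSerializedExceptionName: when a custom exception class is not registered with the stream registry, only a NotSerializableExceptionWrapper carrying the original exception's name and message crosses the wire, and the receiving side must map that name back to a typed exception. A stripped-down sketch of the recovery step; the matching logic and class names are illustrative, not the plugin's actual implementation:

    import java.util.Optional;

    public class UnwrapSketch {
        // Hypothetical stand-ins for the timeseries exception hierarchy.
        static class BaseException extends RuntimeException {
            BaseException(String configId, String msg) { super(msg); }
        }
        static class InternalFailure extends BaseException {
            InternalFailure(String configId, String msg) { super(configId, msg); }
        }

        // The wrapper's message encodes the original class name; match on it to rebuild a typed exception.
        static Optional<BaseException> convert(Exception wrapped, String configId) {
            String encodedName = wrapped.getMessage(); // real code parses the wrapper's encoded exception name
            if (encodedName != null && encodedName.contains("internal_failure")) {
                return Optional.of(new InternalFailure(configId, encodedName));
            }
            return Optional.empty(); // unknown exceptions stay wrapped
        }
    }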
diff --git a/src/test/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolatorTests.java b/src/test/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolatorTests.java
deleted file mode 100644
index e668b51a0..000000000
--- a/src/test/java/org/opensearch/ad/dataprocessor/SingleFeatureLinearUniformInterpolatorTests.java
+++ /dev/null
@@ -1,71 +0,0 @@
-/*
- * SPDX-License-Identifier: Apache-2.0
- *
- * The OpenSearch Contributors require contributions made to
- * this file be licensed under the Apache-2.0 license or a
- * compatible open source license.
- *
- * Modifications Copyright OpenSearch Contributors. See
- * GitHub history for details.
- */
-
-package org.opensearch.ad.dataprocessor;
-
-import static org.junit.Assert.assertArrayEquals;
-
-import java.util.Arrays;
-import java.util.Collection;
-
-import org.junit.Before;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
-import org.junit.runners.Parameterized.Parameters;
-
-@RunWith(Parameterized.class)
-public class SingleFeatureLinearUniformInterpolatorTests {
-
-    @Parameters
-    public static Collection data() {
-        double[] singleComponent = { -1.0, 2.0 };
-        double[] multiComponent = { 0.0, 1.0, -1. };
-        double oneThird = 1.0 / 3.0;
-
-        return Arrays
-            .asList(
-                new Object[][] {
-                    { new double[0], 1, new double[0] },
-                    { new double[] { 1 }, 2, new double[] { 1, 1 } },
-                    { singleComponent, 2, singleComponent },
-                    { singleComponent, 3, new double[] { -1.0, 0.5, 2.0 } },
-                    { singleComponent, 4, new double[] { -1.0, 0.0, 1.0, 2.0 } },
-                    { multiComponent, 3, multiComponent },
-                    { multiComponent, 4, new double[] { 0.0, 2 * oneThird, oneThird, -1.0 } },
-                    { multiComponent, 5, new double[] { 0.0, 0.5, 1.0, 0.0, -1.0 } },
-                    { multiComponent, 6, new double[] { 0.0, 0.4, 0.8, 0.6, -0.2, -1.0 } } }
-            );
-    }
-
-    private double[] input;
-    private int numInterpolants;
-    private double[] expected;
-    private SingleFeatureLinearUniformInterpolator interpolator;
-
-    public SingleFeatureLinearUniformInterpolatorTests(double[] input, int numInterpolants, double[] expected) {
-        this.input = input;
-        this.numInterpolants = numInterpolants;
-        this.expected = expected;
-    }
-
-    @Before
-    public void setUp() {
-        this.interpolator = new SingleFeatureLinearUniformInterpolator();
-    }
-
-    @Test
-    public void testInterpolation() {
-        double[] actual = interpolator.interpolate(input, numInterpolants);
-        double delta = 1e-8;
-        assertArrayEquals(expected, actual, delta);
-    }
-}
diff --git a/src/test/java/org/opensearch/ad/e2e/AbstractSyntheticDataTest.java b/src/test/java/org/opensearch/ad/e2e/AbstractSyntheticDataTest.java
index 6721734b8..3b6d8b482 100644
--- a/src/test/java/org/opensearch/ad/e2e/AbstractSyntheticDataTest.java
+++ b/src/test/java/org/opensearch/ad/e2e/AbstractSyntheticDataTest.java
@@ -11,9 +11,9 @@
 
 package org.opensearch.ad.e2e;
 
-import static org.opensearch.ad.TestHelpers.toHttpEntity;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE;
+import static org.opensearch.timeseries.TestHelpers.toHttpEntity;
+import static org.opensearch.timeseries.settings.TimeSeriesSettings.BACKOFF_MINUTES;
+import static org.opensearch.timeseries.settings.TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE;
 
 import java.io.File;
 import java.io.FileReader;
@@ -29,7 +29,6 @@
 import org.apache.http.HttpHeaders;
 import org.apache.http.message.BasicHeader;
 import org.opensearch.ad.ODFERestTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.client.Request;
 import org.opensearch.client.RequestOptions;
 import org.opensearch.client.Response;
@@ -38,6 +37,7 @@
 import org.opensearch.common.xcontent.json.JsonXContent;
 import org.opensearch.core.common.Strings;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.TestHelpers;
 
 import com.google.common.collect.ImmutableList;
 import com.google.gson.JsonArray;
@@ -45,7 +45,6 @@
 import com.google.gson.JsonParser;
 
 public class AbstractSyntheticDataTest extends ODFERestTestCase {
-
     /**
      * In real time AD, we mute a node for a detector if that node keeps returning
     * ResourceNotFoundException (5 times in a row). This is a problem for batch mode
diff --git a/src/test/java/org/opensearch/ad/e2e/DetectionResultEvalutationIT.java b/src/test/java/org/opensearch/ad/e2e/DetectionResultEvalutationIT.java
index 8273032e7..e856cd1cd 100644
--- a/src/test/java/org/opensearch/ad/e2e/DetectionResultEvalutationIT.java
+++ b/src/test/java/org/opensearch/ad/e2e/DetectionResultEvalutationIT.java
@@ -11,7 +11,7 @@
 
 package org.opensearch.ad.e2e;
 
-import static org.opensearch.ad.TestHelpers.toHttpEntity;
+import static org.opensearch.timeseries.TestHelpers.toHttpEntity;
 
 import java.text.SimpleDateFormat;
 import java.time.Clock;
@@ -27,12 +27,12 @@
 
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.core.Logger;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonErrorMessages;
 import org.opensearch.client.Request;
 import org.opensearch.client.Response;
 import org.opensearch.client.RestClient;
 import org.opensearch.common.xcontent.support.XContentMapValues;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonMessages;
 
 import com.google.common.collect.ImmutableMap;
 import com.google.gson.JsonElement;
@@ -117,10 +117,7 @@ public void testValidationIntervalRecommendation() throws Exception {
         @SuppressWarnings("unchecked")
         Map> messageMap = (Map>) XContentMapValues
             .extractValue("model", responseMap);
-        assertEquals(
-            CommonErrorMessages.DETECTOR_INTERVAL_REC + recDetectorIntervalMinutes,
-            messageMap.get("detection_interval").get("message")
-        );
+        assertEquals(CommonMessages.INTERVAL_REC + recDetectorIntervalMinutes, messageMap.get("detection_interval").get("message"));
     }
 
     public void testValidationWindowDelayRecommendation() throws Exception {
@@ -158,7 +155,7 @@ public void testValidationWindowDelayRecommendation() throws Exception {
         Map> messageMap = (Map>) XContentMapValues
             .extractValue("model", responseMap);
         assertEquals(
-            String.format(Locale.ROOT, CommonErrorMessages.WINDOW_DELAY_REC, expectedWindowDelayMinutes, expectedWindowDelayMinutes),
+            String.format(Locale.ROOT, CommonMessages.WINDOW_DELAY_REC, expectedWindowDelayMinutes, expectedWindowDelayMinutes),
             messageMap.get("window_delay").get("message")
         );
     }
diff --git a/src/test/java/org/opensearch/ad/e2e/SingleStreamModelPerfIT.java b/src/test/java/org/opensearch/ad/e2e/SingleStreamModelPerfIT.java
index 2cf080a14..23f5f0f95 100644
--- a/src/test/java/org/opensearch/ad/e2e/SingleStreamModelPerfIT.java
+++ b/src/test/java/org/opensearch/ad/e2e/SingleStreamModelPerfIT.java
@@ -11,7 +11,7 @@
 
 package org.opensearch.ad.e2e;
 
-import static org.opensearch.ad.TestHelpers.toHttpEntity;
+import static org.opensearch.timeseries.TestHelpers.toHttpEntity;
 
 import java.io.File;
 import java.io.FileReader;
@@ -32,8 +32,8 @@
 import org.apache.http.message.BasicHeader;
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.core.Logger;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.client.RestClient;
+import org.opensearch.timeseries.TestHelpers;
 
 import com.google.common.collect.ImmutableList;
 import com.google.gson.JsonArray;
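The SingleFeatureLinearUniformInterpolatorTests deleted above double as a specification: map n samples onto m evenly spaced outputs and linearly interpolate between neighbors, e.g. {-1.0, 2.0} with 3 interpolants yields {-1.0, 0.5, 2.0}. The renamed LinearUniformImputer stubbed in FeatureManagerTests below has to honor the same contract. A standalone re-implementation that reproduces the deleted expectations; illustrative only, since the plugin's Imputer interface may differ:

    import java.util.Arrays;

    public class LinearUniformSketch {
        static double[] interpolate(double[] samples, int numInterpolants) {
            if (samples.length == 0) {
                return new double[0]; // deleted test: empty input stays empty
            }
            double[] out = new double[numInterpolants];
            if (samples.length == 1 || numInterpolants == 1) {
                Arrays.fill(out, samples[0]); // deleted test: {1} with 2 interpolants -> {1, 1}
                return out;
            }
            // Evenly spaced positions over the input's index range [0, n-1].
            double step = (double) (samples.length - 1) / (numInterpolants - 1);
            for (int i = 0; i < numInterpolants; i++) {
                double t = i * step;
                int lo = (int) Math.floor(t);
                int hi = Math.min(lo + 1, samples.length - 1);
                double frac = t - lo;
                out[i] = samples[lo] * (1 - frac) + samples[hi] * frac;
            }
            return out;
        }

        public static void main(String[] args) {
            // Matches the deleted expectation { -1.0, 0.5, 2.0 }.
            System.out.println(Arrays.toString(interpolate(new double[] { -1.0, 2.0 }, 3)));
        }
    }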
diff --git a/src/test/java/org/opensearch/ad/feature/FeatureManagerTests.java b/src/test/java/org/opensearch/ad/feature/FeatureManagerTests.java
index 8a3795dcd..b78647c11 100644
--- a/src/test/java/org/opensearch/ad/feature/FeatureManagerTests.java
+++ b/src/test/java/org/opensearch/ad/feature/FeatureManagerTests.java
@@ -52,17 +52,18 @@
 import org.mockito.ArgumentCaptor;
 import org.mockito.Mock;
 import org.mockito.MockitoAnnotations;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.dataprocessor.Interpolator;
-import org.opensearch.ad.dataprocessor.LinearUniformInterpolator;
-import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
 import org.opensearch.ad.util.ArrayEqMatcher;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.dataprocessor.Imputer;
+import org.opensearch.timeseries.dataprocessor.LinearUniformImputer;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
 
 import junitparams.JUnitParamsRunner;
 import junitparams.Parameters;
@@ -91,7 +92,7 @@ public class FeatureManagerTests {
     private SearchFeatureDao searchFeatureDao;
 
     @Mock
-    private Interpolator interpolator;
+    private Imputer imputer;
 
     @Mock
     private Clock clock;
@@ -119,17 +120,17 @@ public void setup() {
         featureBufferTtl = Duration.ofMillis(1_000L);
 
         detectorId = "id";
-        when(detector.getDetectorId()).thenReturn(detectorId);
+        when(detector.getId()).thenReturn(detectorId);
         when(detector.getShingleSize()).thenReturn(shingleSize);
 
         IntervalTimeConfiguration detectorIntervalTimeConfig = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES);
         intervalInMilliseconds = detectorIntervalTimeConfig.toDuration().toMillis();
-        when(detector.getDetectorIntervalInMilliseconds()).thenReturn(intervalInMilliseconds);
+        when(detector.getIntervalInMilliseconds()).thenReturn(intervalInMilliseconds);
 
-        Interpolator interpolator = new LinearUniformInterpolator(new SingleFeatureLinearUniformInterpolator());
+        Imputer imputer = new LinearUniformImputer(false);
 
         ExecutorService executorService = mock(ExecutorService.class);
-        when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService);
+        when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService);
         doAnswer(invocation -> {
             Runnable runnable = invocation.getArgument(0);
             runnable.run();
@@ -139,7 +140,7 @@ public void setup() {
         this.featureManager = spy(
             new FeatureManager(
                 searchFeatureDao,
-                interpolator,
+                imputer,
                 clock,
                 maxTrainSamples,
                 maxSampleStride,
@@ -151,7 +152,7 @@ public void setup() {
                 maxPreviewSamples,
                 featureBufferTtl,
                 threadPool,
-                AnomalyDetectorPlugin.AD_THREAD_POOL_NAME
+                TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME
             )
         );
     }
@@ -196,7 +197,7 @@ public void getColdStartData_returnExpectedToListener(
         double[][] expected
     ) throws Exception {
         long detectionInterval = (new IntervalTimeConfiguration(15, ChronoUnit.MINUTES)).toDuration().toMillis();
-        when(detector.getDetectorIntervalInMilliseconds()).thenReturn(detectionInterval);
+        when(detector.getIntervalInMilliseconds()).thenReturn(detectionInterval);
         when(detector.getShingleSize()).thenReturn(4);
         doAnswer(invocation -> {
             ActionListener> listener = invocation.getArgument(1);
@@ -205,17 +206,19 @@ public void getColdStartData_returnExpectedToListener(
         }).when(searchFeatureDao).getLatestDataTime(eq(detector), any(ActionListener.class));
         if (latestTime != null) {
             doAnswer(invocation -> {
-                ActionListener>> listener = invocation.getArgument(2);
+                ActionListener>> listener = invocation.getArgument(3);
                 listener.onResponse(samples);
                 return null;
-            }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), eq(sampleRanges), any(ActionListener.class));
+            })
+                .when(searchFeatureDao)
+                .getFeatureSamplesForPeriods(eq(detector), eq(sampleRanges), eq(AnalysisType.AD), any(ActionListener.class));
         }
 
         ActionListener> listener = mock(ActionListener.class);
         featureManager = spy(
             new FeatureManager(
                 searchFeatureDao,
-                interpolator,
+                imputer,
                 clock,
                 maxTrainSamples,
                 maxSampleStride,
@@ -227,7 +230,7 @@ public void getColdStartData_returnExpectedToListener(
                 maxPreviewSamples,
                 featureBufferTtl,
                 threadPool,
-                AnomalyDetectorPlugin.AD_THREAD_POOL_NAME
+                TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME
             )
         );
         featureManager.getColdStartData(detector, listener);
@@ -261,7 +264,9 @@ public void getColdStartData_throwToListener_onQueryCreationError() throws Excep
             listener.onResponse(Optional.ofNullable(0L));
             return null;
         }).when(searchFeatureDao).getLatestDataTime(eq(detector), any(ActionListener.class));
-        doThrow(IOException.class).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(), any(ActionListener.class));
+        doThrow(IOException.class)
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(), eq(AnalysisType.AD), any(ActionListener.class));
 
         ActionListener> listener = mock(ActionListener.class);
         featureManager.getColdStartData(detector, listener);
@@ -320,7 +325,7 @@ public void clear_deleteFeatures() throws IOException {
 
         AtomicBoolean firstQuery = new AtomicBoolean(true);
         doAnswer(invocation -> {
-            ActionListener>> daoListener = invocation.getArgument(2);
+            ActionListener>> daoListener = invocation.getArgument(3);
             if (firstQuery.get()) {
                 firstQuery.set(false);
                 daoListener
@@ -329,14 +334,16 @@ public void clear_deleteFeatures() throws IOException {
                 daoListener.onResponse(asList(Optional.ofNullable(null), Optional.ofNullable(null), Optional.of(new double[] { 1 })));
             }
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+        })
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         featureManager.getCurrentFeatures(detector, start, end, mock(ActionListener.class));
 
         SinglePointFeatures beforeMaintenance = getCurrentFeatures(detector, start, end);
         assertTrue(beforeMaintenance.getUnprocessedFeatures().isPresent());
         assertTrue(beforeMaintenance.getProcessedFeatures().isPresent());
 
-        featureManager.clear(detector.getDetectorId());
+        featureManager.clear(detector.getId());
 
         SinglePointFeatures afterMaintenance = getCurrentFeatures(detector, start, end);
         assertTrue(afterMaintenance.getUnprocessedFeatures().isPresent());
@@ -359,7 +366,7 @@ public void maintenance_removeStaleData() throws IOException {
 
         AtomicBoolean firstQuery = new AtomicBoolean(true);
         doAnswer(invocation -> {
-            ActionListener>> daoListener = invocation.getArgument(2);
+            ActionListener>> daoListener = invocation.getArgument(3);
             if (firstQuery.get()) {
                 firstQuery.set(false);
                 daoListener
@@ -368,7 +375,9 @@ public void maintenance_removeStaleData() throws IOException {
                 daoListener.onResponse(asList(Optional.ofNullable(null), Optional.ofNullable(null), Optional.of(new double[] { 1 })));
             }
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+        })
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         featureManager.getCurrentFeatures(detector, start, end, mock(ActionListener.class));
 
         SinglePointFeatures beforeMaintenance = getCurrentFeatures(detector, start, end);
@@ -391,7 +400,7 @@ public void maintenance_keepRecentData() throws IOException {
 
         AtomicBoolean firstQuery = new AtomicBoolean(true);
         doAnswer(invocation -> {
-            ActionListener>> daoListener = invocation.getArgument(2);
+            ActionListener>> daoListener = invocation.getArgument(3);
             if (firstQuery.get()) {
                 firstQuery.set(false);
                 daoListener
@@ -400,7 +409,9 @@ public void maintenance_keepRecentData() throws IOException {
                 daoListener.onResponse(asList(Optional.ofNullable(null), Optional.ofNullable(null), Optional.of(new double[] { 1 })));
             }
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+        })
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         featureManager.getCurrentFeatures(detector, start, end, mock(ActionListener.class));
 
         SinglePointFeatures beforeMaintenance = getCurrentFeatures(detector, start, end);
@@ -428,7 +439,7 @@ private void getPreviewFeaturesTemplate(List> samplesResults,
         long start = 0L;
         long end = 240_000L;
         long detectionInterval = (new IntervalTimeConfiguration(1, ChronoUnit.MINUTES)).toDuration().toMillis();
-        when(detector.getDetectorIntervalInMilliseconds()).thenReturn(detectionInterval);
+        when(detector.getIntervalInMilliseconds()).thenReturn(detectionInterval);
 
         List> sampleRanges = Arrays.asList(new SimpleEntry<>(0L, 60_000L), new SimpleEntry<>(120_000L, 180_000L));
         doAnswer(invocation -> {
@@ -436,8 +447,8 @@ private void getPreviewFeaturesTemplate(List> samplesResults,
 
             ActionListener>> listener = null;
 
-            if (args[2] instanceof ActionListener) {
-                listener = (ActionListener>>) args[2];
+            if (args[3] instanceof ActionListener) {
+                listener = (ActionListener>>) args[3];
             }
 
             if (querySuccess) {
@@ -447,13 +458,12 @@ private void getPreviewFeaturesTemplate(List> samplesResults,
             }
 
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), eq(sampleRanges), any());
+        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), eq(sampleRanges), eq(AnalysisType.AD), any());
 
-        when(interpolator.interpolate(argThat(new ArrayEqMatcher<>(new double[][] { { 1, 3 } })), eq(3)))
-            .thenReturn(new double[][] { { 1, 2, 3 } });
-        when(interpolator.interpolate(argThat(new ArrayEqMatcher<>(new double[][] { { 0, 120000 } })), eq(3)))
+        when(imputer.impute(argThat(new ArrayEqMatcher<>(new double[][] { { 1, 3 } })), eq(3))).thenReturn(new double[][] { { 1, 2, 3 } });
+        when(imputer.impute(argThat(new ArrayEqMatcher<>(new double[][] { { 0, 120000 } })), eq(3)))
             .thenReturn(new double[][] { { 0, 60000, 120000 } });
-        when(interpolator.interpolate(argThat(new ArrayEqMatcher<>(new double[][] { { 60000, 180000 } })), eq(3)))
+        when(imputer.impute(argThat(new ArrayEqMatcher<>(new double[][] { { 60000, 180000 } })), eq(3)))
            .thenReturn(new double[][] { { 60000, 120000, 180000 } });
 
         ActionListener listener = mock(ActionListener.class);
@@ -498,10 +508,10 @@ public void getPreviewFeatureForEntity() throws IOException {
         coldStartSamples.add(Optional.of(new double[] { 30.0 }));
 
         doAnswer(invocation -> {
-            ActionListener>> listener = invocation.getArgument(4);
+            ActionListener>> listener = invocation.getArgument(5);
             listener.onResponse(coldStartSamples);
             return null;
-        }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any());
+        }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any());
 
         ActionListener listener = mock(ActionListener.class);
 
@@ -522,10 +532,10 @@ public void getPreviewFeatureForEntity_noDataToPreview() throws IOException {
         Entity entity = Entity.createSingleAttributeEntity("fieldName", "value");
 
         doAnswer(invocation -> {
-            ActionListener>> listener = invocation.getArgument(4);
+            ActionListener>> listener = invocation.getArgument(5);
             listener.onResponse(new ArrayList<>());
             return null;
-        }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any());
+        }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any());
 
         ActionListener listener = mock(ActionListener.class);
 
@@ -562,7 +572,7 @@ private void setupSearchFeatureDaoForGetCurrentFeatures(
 
         AtomicBoolean isPreQuery = new AtomicBoolean(true);
         doAnswer(invocation -> {
-            ActionListener>> daoListener = invocation.getArgument(2);
+            ActionListener>> daoListener = invocation.getArgument(3);
             if (isPreQuery.get()) {
                 isPreQuery.set(false);
                 daoListener.onResponse(preQueryResponse);
@@ -574,7 +584,9 @@ private void setupSearchFeatureDaoForGetCurrentFeatures(
                 }
             }
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+        })
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
     }
 
     private Object[] getCurrentFeaturesTestData_whenAfterQueryResultsFormFullShingle() {
@@ -617,7 +629,7 @@ public void getCurrentFeatures_returnExpectedProcessedFeatures_whenAfterQueryRes
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, testStartTime, testEndTime);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
 
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         assertTrue(listenerResponse.getProcessedFeatures().isPresent());
@@ -654,7 +666,7 @@ public void getCurrentFeatures_returnExpectedProcessedFeatures_whenNoQueryNeeded
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, start, end);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
 
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         assertTrue(listenerResponse.getProcessedFeatures().isPresent());
@@ -714,7 +726,7 @@ public void getCurrentFeatures_returnExpectedProcessedFeatures_whenAfterQueryRes
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, testStartTime, testEndTime);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
 
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         assertTrue(listenerResponse.getProcessedFeatures().isPresent());
@@ -760,7 +772,7 @@ public void getCurrentFeatures_returnNoProcessedOrUnprocessedFeatures_whenMissin
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, testStartTime, testEndTime);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         assertFalse(listenerResponse.getUnprocessedFeatures().isPresent());
         assertFalse(listenerResponse.getProcessedFeatures().isPresent());
     }
@@ -797,7 +809,7 @@ public void getCurrentFeatures_returnNoProcessedFeatures_whenAfterQueryResultsCa
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, testStartTime, testEndTime);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         assertFalse(listenerResponse.getProcessedFeatures().isPresent());
     }
@@ -828,7 +840,7 @@ public void getCurrentFeatures_returnExceptionToListener_whenQueryThrowsIOExcept
         ActionListener listener = mock(ActionListener.class);
         featureManager.getCurrentFeatures(detector, testStartTime, testEndTime, listener);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
         verify(listener).onFailure(any(IOException.class));
     }
@@ -861,12 +873,17 @@ public void getCurrentFeatures_returnExpectedFeatures_cacheMissingData(
         // first call to cache missing points
         featureManager.getCurrentFeatures(detector, firstStartTime, firstEndTime, mock(ActionListener.class));
         verify(searchFeatureDao, times(1))
-            .getFeatureSamplesForPeriods(eq(detector), argThat(list -> list.size() == shingleSize), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(
+                eq(detector),
+                argThat(list -> list.size() == shingleSize),
+                eq(AnalysisType.AD),
+                any(ActionListener.class)
+            );
 
         // second call should only fetch current point even if previous points missing
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, secondStartTime, secondEndTime);
         verify(searchFeatureDao, times(1))
-            .getFeatureSamplesForPeriods(eq(detector), argThat(list -> list.size() == 1), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), argThat(list -> list.size() == 1), eq(AnalysisType.AD), any(ActionListener.class));
 
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         if (expectedProcessedFeaturesOptional.isPresent()) {
@@ -944,7 +961,7 @@ public void getCurrentFeatures_returnExpectedFeatures_withTimeJitterUpToHalfInte
         // Start test
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, testStartTime, testEndTime);
         verify(searchFeatureDao, times(expectedNumQueriesToSearchFeatureDao))
-            .getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
 
         assertTrue(listenerResponse.getUnprocessedFeatures().isPresent());
         assertTrue(listenerResponse.getProcessedFeatures().isPresent());
@@ -979,19 +996,21 @@ public void getCurrentFeatures_setsShingleSizeFromDetectorConfig(int shingleSize
             List> ranges = invocation.getArgument(1);
             assertEquals(ranges.size(), shingleSize);
 
-            ActionListener>> daoListener = invocation.getArgument(2);
+            ActionListener>> daoListener = invocation.getArgument(3);
             List> response = new ArrayList>();
             for (int i = 0; i < ranges.size(); i++) {
                 response.add(Optional.of(new double[] { i }));
             }
             daoListener.onResponse(response);
             return null;
-        }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class));
+        })
+            .when(searchFeatureDao)
+            .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class));
 
         SinglePointFeatures listenerResponse = getCurrentFeatures(detector, 0, intervalInMilliseconds);
         assertTrue(listenerResponse.getProcessedFeatures().isPresent());
         assertEquals(listenerResponse.getProcessedFeatures().get().length, shingleSize);
-        assertEquals(featureManager.getShingleSize(detector.getDetectorId()), shingleSize);
+        assertEquals(featureManager.getShingleSize(detector.getId()), shingleSize);
     }
 
     @Test
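The dominant mechanical change in FeatureManagerTests is the argument-index shift: every invocation.getArgument(2) became getArgument(3), and getArgument(4) became getArgument(5), because getFeatureSamplesForPeriods and getColdStartSamplesForPeriods each gained an AnalysisType parameter ahead of the listener. The index in a doAnswer stub is positional and silently load-bearing, as this compact sketch with a hypothetical DAO shows:

    import static org.mockito.ArgumentMatchers.any;
    import static org.mockito.ArgumentMatchers.eq;
    import static org.mockito.Mockito.doAnswer;
    import static org.mockito.Mockito.mock;

    import java.util.List;
    import java.util.function.Consumer;

    public class ArgumentIndexSketch {
        enum AnalysisType { AD }

        // Hypothetical DAO: a new enum parameter was inserted before the callback.
        interface Dao {
            void fetch(String config, List<Long> ranges, AnalysisType type, Consumer<List<Double>> callback);
        }

        static Dao stub(List<Double> canned) {
            Dao dao = mock(Dao.class);
            doAnswer(invocation -> {
                // The callback is now argument 3; using the old index 2 would hand
                // back the AnalysisType and fail with a ClassCastException at runtime.
                Consumer<List<Double>> callback = invocation.getArgument(3);
                callback.accept(canned);
                return null;
            }).when(dao).fetch(any(), any(), eq(AnalysisType.AD), any());
            return dao;
        }
    }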
assertTrue(listenerResponse.getUnprocessedFeatures().isPresent()); assertTrue(listenerResponse.getProcessedFeatures().isPresent()); @@ -979,19 +996,21 @@ public void getCurrentFeatures_setsShingleSizeFromDetectorConfig(int shingleSize List> ranges = invocation.getArgument(1); assertEquals(ranges.size(), shingleSize); - ActionListener>> daoListener = invocation.getArgument(2); + ActionListener>> daoListener = invocation.getArgument(3); List> response = new ArrayList>(); for (int i = 0; i < ranges.size(); i++) { response.add(Optional.of(new double[] { i })); } daoListener.onResponse(response); return null; - }).when(searchFeatureDao).getFeatureSamplesForPeriods(eq(detector), any(List.class), any(ActionListener.class)); + }) + .when(searchFeatureDao) + .getFeatureSamplesForPeriods(eq(detector), any(List.class), eq(AnalysisType.AD), any(ActionListener.class)); SinglePointFeatures listenerResponse = getCurrentFeatures(detector, 0, intervalInMilliseconds); assertTrue(listenerResponse.getProcessedFeatures().isPresent()); assertEquals(listenerResponse.getProcessedFeatures().get().length, shingleSize); - assertEquals(featureManager.getShingleSize(detector.getDetectorId()), shingleSize); + assertEquals(featureManager.getShingleSize(detector.getId()), shingleSize); } @Test diff --git a/src/test/java/org/opensearch/ad/indices/AnomalyDetectionIndicesTests.java b/src/test/java/org/opensearch/ad/indices/AnomalyDetectionIndicesTests.java index 1272bdc20..eb68bbe7f 100644 --- a/src/test/java/org/opensearch/ad/indices/AnomalyDetectionIndicesTests.java +++ b/src/test/java/org/opensearch/ad/indices/AnomalyDetectionIndicesTests.java @@ -16,127 +16,126 @@ import java.util.Collections; import org.junit.Before; -import org.opensearch.action.index.IndexRequest; -import org.opensearch.action.index.IndexResponse; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.DiscoveryNodeFilterer; -import org.opensearch.ad.util.RestHandlerUtils; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.common.settings.Settings; import org.opensearch.common.unit.TimeValue; -import org.opensearch.common.xcontent.XContentFactory; -import org.opensearch.core.rest.RestStatus; -import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.plugins.Plugin; -import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.indices.IndexManagementIntegTestCase; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; -public class AnomalyDetectionIndicesTests extends OpenSearchIntegTestCase { +public class AnomalyDetectionIndicesTests extends IndexManagementIntegTestCase { - private AnomalyDetectionIndices indices; + private ADIndexManagement indices; private Settings settings; private DiscoveryNodeFilterer nodeFilter; - // help register setting using AnomalyDetectorPlugin.getSettings. Otherwise, AnomalyDetectionIndices's constructor would fail due to - // unregistered settings like AD_RESULT_HISTORY_MAX_DOCS. + // help register setting using TimeSeriesAnalyticsPlugin.getSettings. 
+    // Otherwise, ADIndexManagement's constructor would fail due to
+    // unregistered settings like AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD.
     @Override
     protected Collection<Class<? extends Plugin>> nodePlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     @Before
-    public void setup() {
+    public void setup() throws IOException {
         settings = Settings
             .builder()
             .put("plugins.anomaly_detection.ad_result_history_rollover_period", TimeValue.timeValueHours(12))
-            .put("plugins.anomaly_detection.ad_result_history_max_age", TimeValue.timeValueHours(24))
+            .put("plugins.anomaly_detection.ad_result_history_retention_period", TimeValue.timeValueHours(24))
             .put("plugins.anomaly_detection.ad_result_history_max_docs", 10000L)
             .put("plugins.anomaly_detection.request_timeout", TimeValue.timeValueSeconds(10))
             .build();

         nodeFilter = new DiscoveryNodeFilterer(clusterService());

-        indices = new AnomalyDetectionIndices(
+        indices = new ADIndexManagement(
             client(),
             clusterService(),
             client().threadPool(),
             settings,
             nodeFilter,
-            AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES
+            TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES
         );
     }

     public void testAnomalyDetectorIndexNotExists() {
-        boolean exists = indices.doesAnomalyDetectorIndexExist();
+        boolean exists = indices.doesConfigIndexExist();
         assertFalse(exists);
     }

     public void testAnomalyDetectorIndexExists() throws IOException {
-        indices.initAnomalyDetectorIndexIfAbsent(TestHelpers.createActionListener(response -> {
+        indices.initConfigIndexIfAbsent(TestHelpers.createActionListener(response -> {
             boolean acknowledged = response.isAcknowledged();
             assertTrue(acknowledged);
         }, failure -> { throw new RuntimeException("should not recreate index"); }));
-        TestHelpers.waitForIndexCreationToComplete(client(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        TestHelpers.waitForIndexCreationToComplete(client(), CommonName.CONFIG_INDEX);
     }

     public void testAnomalyDetectorIndexExistsAndNotRecreate() throws IOException {
-        indices.initAnomalyDetectorIndexIfAbsent(TestHelpers.createActionListener(response -> response.isAcknowledged(), failure -> {
+        indices.initConfigIndexIfAbsent(TestHelpers.createActionListener(response -> response.isAcknowledged(), failure -> {
             throw new RuntimeException("should not recreate index");
         }));
-        TestHelpers.waitForIndexCreationToComplete(client(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
-        if (client().admin().indices().prepareExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX).get().isExists()) {
-            indices.initAnomalyDetectorIndexIfAbsent(TestHelpers.createActionListener(response -> {
-                throw new RuntimeException("should not recreate index " + AnomalyDetector.ANOMALY_DETECTORS_INDEX);
-            }, failure -> { throw new RuntimeException("should not recreate index " + AnomalyDetector.ANOMALY_DETECTORS_INDEX); }));
+        TestHelpers.waitForIndexCreationToComplete(client(), CommonName.CONFIG_INDEX);
+        if (client().admin().indices().prepareExists(CommonName.CONFIG_INDEX).get().isExists()) {
+            indices.initConfigIndexIfAbsent(TestHelpers.createActionListener(response -> {
+                throw new RuntimeException("should not recreate index " + CommonName.CONFIG_INDEX);
+            }, failure -> { throw new RuntimeException("should not recreate index " + CommonName.CONFIG_INDEX); }));
         }
     }

     public void testAnomalyResultIndexNotExists() {
-        boolean exists = indices.doesDefaultAnomalyResultIndexExist();
+        boolean exists = indices.doesDefaultResultIndexExist();
         assertFalse(exists);
     }
     public void testAnomalyResultIndexExists() throws IOException {
-        indices.initDefaultAnomalyResultIndexIfAbsent(TestHelpers.createActionListener(response -> {
+        indices.initDefaultResultIndexIfAbsent(TestHelpers.createActionListener(response -> {
             boolean acknowledged = response.isAcknowledged();
             assertTrue(acknowledged);
         }, failure -> { throw new RuntimeException("should not recreate index"); }));
-        TestHelpers.waitForIndexCreationToComplete(client(), CommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        TestHelpers.waitForIndexCreationToComplete(client(), ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
     }

     public void testAnomalyResultIndexExistsAndNotRecreate() throws IOException {
         indices
-            .initDefaultAnomalyResultIndexIfAbsent(
+            .initDefaultResultIndexIfAbsent(
                 TestHelpers.createActionListener(response -> logger.info("Acknowledged: " + response.isAcknowledged()), failure -> {
                     throw new RuntimeException("should not recreate index");
                 })
             );
-        TestHelpers.waitForIndexCreationToComplete(client(), CommonName.ANOMALY_RESULT_INDEX_ALIAS);
-        if (client().admin().indices().prepareExists(CommonName.ANOMALY_RESULT_INDEX_ALIAS).get().isExists()) {
-            indices.initDefaultAnomalyResultIndexIfAbsent(TestHelpers.createActionListener(response -> {
-                throw new RuntimeException("should not recreate index " + CommonName.ANOMALY_RESULT_INDEX_ALIAS);
-            }, failure -> { throw new RuntimeException("should not recreate index " + CommonName.ANOMALY_RESULT_INDEX_ALIAS, failure); }));
+        TestHelpers.waitForIndexCreationToComplete(client(), ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        if (client().admin().indices().prepareExists(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS).get().isExists()) {
+            indices.initDefaultResultIndexIfAbsent(TestHelpers.createActionListener(response -> {
+                throw new RuntimeException("should not recreate index " + ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
+            }, failure -> { throw new RuntimeException("should not recreate index " + ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, failure); })
+            );
         }
     }

-    private void createRandomDetector(String indexName) throws IOException {
-        // creates a random anomaly detector and indexes it
-        AnomalyDetector detector = TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null);
-
-        XContentBuilder xContentBuilder = XContentFactory.jsonBuilder();
-        detector.toXContent(xContentBuilder, RestHandlerUtils.XCONTENT_WITH_TYPE);
-
-        IndexResponse indexResponse = client().index(new IndexRequest(indexName).source(xContentBuilder)).actionGet();
-        assertEquals("Doc was not created", RestStatus.CREATED, indexResponse.status());
-    }
-
     public void testGetDetectionStateIndexMapping() throws IOException {
-        String detectorIndexMappings = AnomalyDetectionIndices.getAnomalyDetectorMappings();
+        String detectorIndexMappings = ADIndexManagement.getConfigMappings();
         detectorIndexMappings = detectorIndexMappings
             .substring(detectorIndexMappings.indexOf("\"properties\""), detectorIndexMappings.lastIndexOf("}"));
-        String detectionStateIndexMapping = AnomalyDetectionIndices.getDetectionStateMappings();
+        String detectionStateIndexMapping = ADIndexManagement.getStateMappings();
         assertTrue(detectionStateIndexMapping.contains(detectorIndexMappings));
     }
+
+    public void testValidateCustomIndexForBackendJob() throws IOException, InterruptedException {
+        String resultMapping = ADIndexManagement.getResultMappings();
+
+        validateCustomIndexForBackendJob(indices, resultMapping);
+    }
+
+    public void testValidateCustomIndexForBackendJobInvalidMapping() {
+        validateCustomIndexForBackendJobInvalidMapping(indices);
+    }
+
+    public void testValidateCustomIndexForBackendJobNoIndex() {
+        validateCustomIndexForBackendJobNoIndex(indices);
+    }
 }
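Most of the file above is a mechanical rename: AnomalyDetectionIndices becomes ADIndexManagement, and the AD-specific init methods become config/result variants. For orientation, a minimal sketch of the renamed init-if-absent flow, assuming only the method names visible in this patch (the listener bodies are illustrative):

    import static org.junit.Assert.assertTrue;

    import org.opensearch.action.admin.indices.create.CreateIndexResponse;
    import org.opensearch.core.action.ActionListener;

    // initConfigIndexIfAbsent (was initAnomalyDetectorIndexIfAbsent) keeps the
    // same listener-based contract: respond once when the index is created.
    ActionListener<CreateIndexResponse> onInit = ActionListener.wrap(
        response -> assertTrue(response.isAcknowledged()),
        failure -> { throw new RuntimeException("should not recreate index", failure); }
    );
    indices.initConfigIndexIfAbsent(onInit);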
diff --git a/src/test/java/org/opensearch/ad/indices/CustomIndexTests.java b/src/test/java/org/opensearch/ad/indices/CustomIndexTests.java
index 46935ade4..53bea9015 100644
--- a/src/test/java/org/opensearch/ad/indices/CustomIndexTests.java
+++ b/src/test/java/org/opensearch/ad/indices/CustomIndexTests.java
@@ -22,11 +22,8 @@
 import java.util.Map;

 import org.opensearch.Version;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -36,9 +33,13 @@
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
 import org.opensearch.common.settings.Settings;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;

-public class CustomIndexTests extends AbstractADTest {
-    AnomalyDetectionIndices adIndices;
+public class CustomIndexTests extends AbstractTimeSeriesTest {
+    ADIndexManagement adIndices;
     Client client;
     ClusterService clusterService;
     DiscoveryNodeFilterer nodeFilter;
@@ -71,7 +72,7 @@ public void setUp() throws Exception {
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD,
-                        AnomalyDetectorSettings.MAX_PRIMARY_SHARDS
+                        AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS
                     )
                 )
             )
@@ -81,13 +82,13 @@ public void setUp() throws Exception {

         nodeFilter = mock(DiscoveryNodeFilterer.class);

-        adIndices = new AnomalyDetectionIndices(
+        adIndices = new ADIndexManagement(
             client,
             clusterService,
             threadPool,
             settings,
             nodeFilter,
-            AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES
+            TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES
         );
     }

@@ -109,27 +110,27 @@ private Map<String, Object> createMapping() {
         Map<String, Object> confidence_mapping = new HashMap<>();
         confidence_mapping.put("type", "double");
-        mappings.put(AnomalyResult.CONFIDENCE_FIELD, confidence_mapping);
+        mappings.put(CommonName.CONFIDENCE_FIELD, confidence_mapping);

         Map<String, Object> data_end_time = new HashMap<>();
         data_end_time.put("type", "date");
         data_end_time.put("format", "strict_date_time||epoch_millis");
-        mappings.put(AnomalyResult.DATA_END_TIME_FIELD, data_end_time);
+        mappings.put(CommonName.DATA_END_TIME_FIELD, data_end_time);

         Map<String, Object> data_start_time = new HashMap<>();
         data_start_time.put("type", "date");
         data_start_time.put("format", "strict_date_time||epoch_millis");
-        mappings.put(AnomalyResult.DATA_START_TIME_FIELD, data_start_time);
+        mappings.put(CommonName.DATA_START_TIME_FIELD, data_start_time);

         Map<String, Object> exec_start_mapping = new HashMap<>();
         exec_start_mapping.put("type", "date");
         exec_start_mapping.put("format", "strict_date_time||epoch_millis");
-        mappings.put(AnomalyResult.EXECUTION_START_TIME_FIELD, exec_start_mapping);
+        mappings.put(CommonName.EXECUTION_START_TIME_FIELD, exec_start_mapping);

         Map<String, Object> exec_end_mapping = new HashMap<>();
         exec_end_mapping.put("type", "date");
         exec_end_mapping.put("format", "strict_date_time||epoch_millis");
-        mappings.put(AnomalyResult.EXECUTION_END_TIME_FIELD, exec_end_mapping);
+        mappings.put(CommonName.EXECUTION_END_TIME_FIELD, exec_end_mapping);

         Map<String, Object> detector_id_mapping = new HashMap<>();
         detector_id_mapping.put("type", "keyword");
@@ -141,11 +142,11 @@ private Map<String, Object> createMapping() {
         entity_nested_mapping.put("name", Collections.singletonMap("type", "keyword"));
         entity_nested_mapping.put("value", Collections.singletonMap("type", "keyword"));
         entity_mapping.put(CommonName.PROPERTIES, entity_nested_mapping);
-        mappings.put(AnomalyResult.ENTITY_FIELD, entity_mapping);
+        mappings.put(CommonName.ENTITY_FIELD, entity_mapping);

         Map<String, Object> error_mapping = new HashMap<>();
         error_mapping.put("type", "text");
-        mappings.put(AnomalyResult.ERROR_FIELD, error_mapping);
+        mappings.put(CommonName.ERROR_FIELD, error_mapping);

         Map<String, Object> expected_mapping = new HashMap<>();
         expected_mapping.put("type", "nested");
@@ -167,9 +168,9 @@ private Map<String, Object> createMapping() {
         feature_mapping.put(CommonName.PROPERTIES, feature_nested_mapping);
         feature_nested_mapping.put("data", Collections.singletonMap("type", "double"));
         feature_nested_mapping.put("feature_id", Collections.singletonMap("type", "keyword"));
-        mappings.put(AnomalyResult.FEATURE_DATA_FIELD, feature_mapping);
+        mappings.put(CommonName.FEATURE_DATA_FIELD, feature_mapping);
         mappings.put(AnomalyResult.IS_ANOMALY_FIELD, Collections.singletonMap("type", "boolean"));
-        mappings.put(AnomalyResult.MODEL_ID_FIELD, Collections.singletonMap("type", "keyword"));
+        mappings.put(CommonName.MODEL_ID_FIELD, Collections.singletonMap("type", "keyword"));

         Map<String, Object> past_mapping = new HashMap<>();
         past_mapping.put("type", "nested");
@@ -189,7 +190,7 @@ private Map<String, Object> createMapping() {

         mappings.put(CommonName.SCHEMA_VERSION_FIELD, Collections.singletonMap("type", "integer"));

-        mappings.put(AnomalyResult.TASK_ID_FIELD, Collections.singletonMap("type", "keyword"));
+        mappings.put(CommonName.TASK_ID_FIELD, Collections.singletonMap("type", "keyword"));

         mappings.put(AnomalyResult.THRESHOLD_FIELD, Collections.singletonMap("type", "double"));
@@ -216,7 +217,7 @@ private Map<String, Object> createMapping() {
         roles_mapping.put("type", "text");
         roles_mapping.put("fields", Collections.singletonMap("keyword", Collections.singletonMap("type", "keyword")));
         user_nested_mapping.put("roles", roles_mapping);
-        mappings.put(AnomalyResult.USER_FIELD, user_mapping);
+        mappings.put(CommonName.USER_FIELD, user_mapping);

         return mappings;
     }
@@ -254,7 +255,7 @@ public void testCorrectReordered() throws IOException {
         // feature_id comes before data compared with what createMapping returned
         feature_nested_mapping.put("feature_id", Collections.singletonMap("type", "keyword"));
         feature_nested_mapping.put("data", Collections.singletonMap("type", "double"));
-        mappings.put(AnomalyResult.FEATURE_DATA_FIELD, feature_mapping);
+        mappings.put(CommonName.FEATURE_DATA_FIELD, feature_mapping);

         IndexMetadata indexMetadata1 = new IndexMetadata.Builder(customIndexName)
             .settings(
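CustomIndexTests now keys its expected result-index mapping by the shared CommonName constants instead of AnomalyResult's own field names. A tiny sketch of that pattern, not part of the patch, using only constants that appear above (the "type" values mirror the assertions in this file):

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;

    import org.opensearch.timeseries.constant.CommonName;

    // Expected-mapping entries are keyed by the shared constants so the same
    // field names can be reused by other time-series analyses.
    Map<String, Object> mappings = new HashMap<>();
    Map<String, Object> confidence = new HashMap<>();
    confidence.put("type", "double");
    mappings.put(CommonName.CONFIDENCE_FIELD, confidence);
    mappings.put(CommonName.MODEL_ID_FIELD, Collections.singletonMap("type", "keyword"));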
diff --git a/src/test/java/org/opensearch/ad/indices/InitAnomalyDetectionIndicesTests.java b/src/test/java/org/opensearch/ad/indices/InitAnomalyDetectionIndicesTests.java
index 84fe9bd58..edb932d4c 100644
--- a/src/test/java/org/opensearch/ad/indices/InitAnomalyDetectionIndicesTests.java
+++ b/src/test/java/org/opensearch/ad/indices/InitAnomalyDetectionIndicesTests.java
@@ -28,32 +28,31 @@
 import org.opensearch.action.admin.indices.alias.Alias;
 import org.opensearch.action.admin.indices.create.CreateIndexRequest;
 import org.opensearch.action.admin.indices.create.CreateIndexResponse;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.AdminClient;
 import org.opensearch.client.Client;
 import org.opensearch.client.IndicesAdminClient;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
 import org.opensearch.cluster.metadata.Metadata;
-import org.opensearch.cluster.routing.RoutingTable;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;

-public class InitAnomalyDetectionIndicesTests extends AbstractADTest {
+public class InitAnomalyDetectionIndicesTests extends AbstractTimeSeriesTest {
     Client client;
     ClusterService clusterService;
     ThreadPool threadPool;
     Settings settings;
     DiscoveryNodeFilterer nodeFilter;
-    AnomalyDetectionIndices adIndices;
+    ADIndexManagement adIndices;
     ClusterName clusterName;
     ClusterState clusterState;
     IndicesAdminClient indicesClient;
@@ -87,7 +86,7 @@ public void setUp() throws Exception {
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD,
-                        AnomalyDetectorSettings.MAX_PRIMARY_SHARDS
+                        AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS
                     )
                 )
             )
@@ -98,13 +97,13 @@ public void setUp() throws Exception {
         clusterState = ClusterState.builder(clusterName).metadata(Metadata.builder().build()).build();
         when(clusterService.state()).thenReturn(clusterState);

-        adIndices = new AnomalyDetectionIndices(
+        adIndices = new ADIndexManagement(
             client,
             clusterService,
             threadPool,
             settings,
             nodeFilter,
-            AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES
+            TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES
         );
     }

@@ -121,10 +120,10 @@ private void fixedPrimaryShardsIndexCreationTemplate(String index) throws IOExce
         }).when(indicesClient).create(any(), any());

         ActionListener<CreateIndexResponse> listener = mock(ActionListener.class);
-        if (index.equals(AnomalyDetector.ANOMALY_DETECTORS_INDEX)) {
-            adIndices.initAnomalyDetectorIndexIfAbsent(listener);
+        if (index.equals(CommonName.CONFIG_INDEX)) {
+            adIndices.initConfigIndexIfAbsent(listener);
         } else {
-            adIndices.initDetectionStateIndex(listener);
+            adIndices.initStateIndex(listener);
         }

         ArgumentCaptor<CreateIndexResponse> captor = ArgumentCaptor.forClass(CreateIndexResponse.class);
@@ -134,22 +133,24 @@ private void fixedPrimaryShardsIndexCreationTemplate(String index) throws IOExce
     }

     @SuppressWarnings("unchecked")
-    private void fixedPrimaryShardsIndexNoCreationTemplate(String index, String alias) throws IOException {
+    private void fixedPrimaryShardsIndexNoCreationTemplate(String index, String... alias) throws IOException {
         clusterState = mock(ClusterState.class);
         when(clusterService.state()).thenReturn(clusterState);

-        RoutingTable.Builder rb = RoutingTable.builder();
-        rb.addAsNew(indexMeta(index, 1L));
-        when(clusterState.getRoutingTable()).thenReturn(rb.build());
+        // RoutingTable.Builder rb = RoutingTable.builder();
+        // rb.addAsNew(indexMeta(index, 1L));
+        // when(clusterState.metadata()).thenReturn(rb.build());

         Metadata.Builder mb = Metadata.builder();
-        mb.put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, CommonName.ANOMALY_RESULT_INDEX_ALIAS), true);
+        // mb.put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS), true);
+        mb.put(indexMeta(index, 1L, alias), true);
+        when(clusterState.metadata()).thenReturn(mb.build());

         ActionListener<CreateIndexResponse> listener = mock(ActionListener.class);
-        if (index.equals(AnomalyDetector.ANOMALY_DETECTORS_INDEX)) {
-            adIndices.initAnomalyDetectorIndexIfAbsent(listener);
+        if (index.equals(CommonName.CONFIG_INDEX)) {
+            adIndices.initConfigIndexIfAbsent(listener);
         } else {
-            adIndices.initDefaultAnomalyResultIndexIfAbsent(listener);
+            adIndices.initDefaultResultIndexIfAbsent(listener);
         }

         verify(indicesClient, never()).create(any(), any());
@@ -160,14 +161,14 @@ private void adaptivePrimaryShardsIndexCreationTemplate(String index) throws IOE

         doAnswer(invocation -> {
             CreateIndexRequest request = invocation.getArgument(0);
-            if (index.equals(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) {
-                assertTrue(request.aliases().contains(new Alias(CommonName.ANOMALY_RESULT_INDEX_ALIAS)));
+            if (index.equals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) {
+                assertTrue(request.aliases().contains(new Alias(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)));
             } else {
                 assertEquals(index, request.index());
             }

             Settings settings = request.settings();
-            if (index.equals(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)) {
+            if (index.equals(CommonName.JOB_INDEX)) {
                 assertThat(settings.get("index.number_of_shards"), equalTo(Integer.toString(1)));
             } else {
                 assertThat(settings.get("index.number_of_shards"), equalTo(Integer.toString(numberOfHotNodes)));
@@ -180,16 +181,16 @@ private void adaptivePrimaryShardsIndexCreationTemplate(String index) throws IOE
         }).when(indicesClient).create(any(), any());

         ActionListener<CreateIndexResponse> listener = mock(ActionListener.class);
-        if (index.equals(AnomalyDetector.ANOMALY_DETECTORS_INDEX)) {
-            adIndices.initAnomalyDetectorIndexIfAbsent(listener);
-        } else if (index.equals(CommonName.DETECTION_STATE_INDEX)) {
-            adIndices.initDetectionStateIndex(listener);
-        } else if (index.equals(CommonName.CHECKPOINT_INDEX_NAME)) {
+        if (index.equals(CommonName.CONFIG_INDEX)) {
+            adIndices.initConfigIndexIfAbsent(listener);
+        } else if (index.equals(ADCommonName.DETECTION_STATE_INDEX)) {
+            adIndices.initStateIndex(listener);
+        } else if (index.equals(ADCommonName.CHECKPOINT_INDEX_NAME)) {
             adIndices.initCheckpointIndex(listener);
-        } else if (index.equals(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)) {
-            adIndices.initAnomalyDetectorJobIndex(listener);
+        } else if (index.equals(CommonName.JOB_INDEX)) {
+            adIndices.initJobIndex(listener);
         } else {
-            adIndices.initDefaultAnomalyResultIndexIfAbsent(listener);
+            adIndices.initDefaultResultIndexIfAbsent(listener);
         }

         ArgumentCaptor<CreateIndexResponse> captor = ArgumentCaptor.forClass(CreateIndexResponse.class);
@@ -199,30 +200,30 @@ private void adaptivePrimaryShardsIndexCreationTemplate(String index) throws IOE
     }

     public void testNotCreateDetector() throws IOException {
-        fixedPrimaryShardsIndexNoCreationTemplate(AnomalyDetector.ANOMALY_DETECTORS_INDEX, null);
+        fixedPrimaryShardsIndexNoCreationTemplate(CommonName.CONFIG_INDEX);
     }

     public void testNotCreateResult() throws IOException {
-        fixedPrimaryShardsIndexNoCreationTemplate(AnomalyDetector.ANOMALY_DETECTORS_INDEX, null);
+        fixedPrimaryShardsIndexNoCreationTemplate(CommonName.CONFIG_INDEX);
     }

     public void testCreateDetector() throws IOException {
-        fixedPrimaryShardsIndexCreationTemplate(AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        fixedPrimaryShardsIndexCreationTemplate(CommonName.CONFIG_INDEX);
     }

     public void testCreateState() throws IOException {
-        fixedPrimaryShardsIndexCreationTemplate(CommonName.DETECTION_STATE_INDEX);
+        fixedPrimaryShardsIndexCreationTemplate(ADCommonName.DETECTION_STATE_INDEX);
     }

     public void testCreateJob() throws IOException {
-        adaptivePrimaryShardsIndexCreationTemplate(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX);
+        adaptivePrimaryShardsIndexCreationTemplate(CommonName.JOB_INDEX);
     }

     public void testCreateResult() throws IOException {
-        adaptivePrimaryShardsIndexCreationTemplate(CommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        adaptivePrimaryShardsIndexCreationTemplate(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
     }

     public void testCreateCheckpoint() throws IOException {
-        adaptivePrimaryShardsIndexCreationTemplate(CommonName.CHECKPOINT_INDEX_NAME);
+        adaptivePrimaryShardsIndexCreationTemplate(ADCommonName.CHECKPOINT_INDEX_NAME);
     }
 }
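The creation templates above stub IndicesAdminClient.create and complete the listener themselves, so no real cluster is needed. A condensed sketch of that stubbing pattern, not part of the patch, using only types that appear in this file (indicesClient is the mock from setUp):

    import static org.mockito.ArgumentMatchers.any;
    import static org.mockito.Mockito.doAnswer;

    import org.opensearch.action.admin.indices.create.CreateIndexRequest;
    import org.opensearch.action.admin.indices.create.CreateIndexResponse;
    import org.opensearch.core.action.ActionListener;

    // Acknowledge creation of whatever index was requested, synchronously.
    doAnswer(invocation -> {
        CreateIndexRequest request = invocation.getArgument(0);
        ActionListener<CreateIndexResponse> listener = invocation.getArgument(1);
        listener.onResponse(new CreateIndexResponse(true, true, request.index()));
        return null;
    }).when(indicesClient).create(any(), any());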
diff --git a/src/test/java/org/opensearch/ad/indices/RolloverTests.java b/src/test/java/org/opensearch/ad/indices/RolloverTests.java
index 7dcebcb84..34cd44c20 100644
--- a/src/test/java/org/opensearch/ad/indices/RolloverTests.java
+++ b/src/test/java/org/opensearch/ad/indices/RolloverTests.java
@@ -33,10 +33,8 @@
 import org.opensearch.action.admin.indices.rollover.RolloverRequest;
 import org.opensearch.action.admin.indices.rollover.RolloverResponse;
 import org.opensearch.action.support.master.AcknowledgedResponse;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.AdminClient;
 import org.opensearch.client.Client;
 import org.opensearch.client.ClusterAdminClient;
@@ -49,9 +47,12 @@
 import org.opensearch.common.settings.Settings;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;

-public class RolloverTests extends AbstractADTest {
-    private AnomalyDetectionIndices adIndices;
+public class RolloverTests extends AbstractTimeSeriesTest {
+    private ADIndexManagement adIndices;
     private IndicesAdminClient indicesClient;
     private ClusterAdminClient clusterAdminClient;
     private ClusterName clusterName;
@@ -77,7 +78,7 @@ public void setUp() throws Exception {
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD,
-                        AnomalyDetectorSettings.MAX_PRIMARY_SHARDS
+                        AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS
                     )
                 )
             )
@@ -96,13 +97,13 @@ public void setUp() throws Exception {
         numberOfNodes = 2;
         when(nodeFilter.getNumberOfEligibleDataNodes()).thenReturn(numberOfNodes);

-        adIndices = new AnomalyDetectionIndices(
+        adIndices = new ADIndexManagement(
             client,
             clusterService,
             threadPool,
             settings,
             nodeFilter,
-            AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES
+            TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES
         );

         clusterAdminClient = mock(ClusterAdminClient.class);
@@ -110,7 +111,7 @@ public void setUp() throws Exception {

         doAnswer(invocation -> {
             ClusterStateRequest clusterStateRequest = invocation.getArgument(0);
-            assertEquals(AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN, clusterStateRequest.indices()[0]);
+            assertEquals(ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN, clusterStateRequest.indices()[0]);
             @SuppressWarnings("unchecked")
             ActionListener<ClusterStateResponse> listener = (ActionListener<ClusterStateResponse>) invocation.getArgument(1);
             listener.onResponse(new ClusterStateResponse(clusterName, clusterState, true));
@@ -121,14 +122,14 @@ public void setUp() throws Exception {
     }

     private void assertRolloverRequest(RolloverRequest request) {
-        assertEquals(CommonName.ANOMALY_RESULT_INDEX_ALIAS, request.indices()[0]);
+        assertEquals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, request.indices()[0]);

         Map<String, Condition<?>> conditions = request.getConditions();
         assertEquals(1, conditions.size());
         assertEquals(new MaxDocsCondition(defaultMaxDocs * numberOfNodes), conditions.get(MaxDocsCondition.NAME));

         CreateIndexRequest createIndexRequest = request.getCreateIndexRequest();
-        assertEquals(AnomalyDetectionIndices.AD_RESULT_HISTORY_INDEX_PATTERN, createIndexRequest.index());
+        assertEquals(ADIndexManagement.AD_RESULT_HISTORY_INDEX_PATTERN, createIndexRequest.index());
         assertTrue(createIndexRequest.mappings().contains("data_start_time"));
     }

@@ -146,7 +147,7 @@ public void testNotRolledOver() {
         Metadata.Builder metaBuilder = Metadata
             .builder()
-            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, CommonName.ANOMALY_RESULT_INDEX_ALIAS), true);
+            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS), true);
         clusterState = ClusterState.builder(clusterName).metadata(metaBuilder.build()).build();
         when(clusterService.state()).thenReturn(clusterState);

@@ -161,14 +162,14 @@ private void setUpRolloverSuccess() {
             @SuppressWarnings("unchecked")
             ActionListener<RolloverResponse> listener = (ActionListener<RolloverResponse>) invocation.getArgument(1);

-            assertEquals(CommonName.ANOMALY_RESULT_INDEX_ALIAS, request.indices()[0]);
+            assertEquals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, request.indices()[0]);

             Map<String, Condition<?>> conditions = request.getConditions();
             assertEquals(1, conditions.size());
             assertEquals(new MaxDocsCondition(defaultMaxDocs * numberOfNodes), conditions.get(MaxDocsCondition.NAME));

             CreateIndexRequest createIndexRequest = request.getCreateIndexRequest();
-            assertEquals(AnomalyDetectionIndices.AD_RESULT_HISTORY_INDEX_PATTERN, createIndexRequest.index());
+            assertEquals(ADIndexManagement.AD_RESULT_HISTORY_INDEX_PATTERN, createIndexRequest.index());
             assertTrue(createIndexRequest.mappings().contains("data_start_time"));
             listener.onResponse(new RolloverResponse(null, null, Collections.emptyMap(), request.isDryRun(), true, true, true));
             return null;
@@ -180,12 +181,12 @@ public void testRolledOverButNotDeleted() {
         Metadata.Builder metaBuilder = Metadata
             .builder()
-            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, CommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
+            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 1L, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
             .put(
                 indexMeta(
                     ".opendistro-anomaly-results-history-2020.06.24-000004",
                     Instant.now().toEpochMilli(),
-                    CommonName.ANOMALY_RESULT_INDEX_ALIAS
+                    ADCommonName.ANOMALY_RESULT_INDEX_ALIAS
                 ),
                 true
             );
@@ -201,13 +202,13 @@ private void setUpTriggerDelete() {
         Metadata.Builder metaBuilder = Metadata
             .builder()
-            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000002", 1L, CommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
-            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 2L, CommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
+            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000002", 1L, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
+            .put(indexMeta(".opendistro-anomaly-results-history-2020.06.24-000003", 2L, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS), true)
             .put(
                 indexMeta(
                     ".opendistro-anomaly-results-history-2020.06.24-000004",
                     Instant.now().toEpochMilli(),
-                    CommonName.ANOMALY_RESULT_INDEX_ALIAS
+                    ADCommonName.ANOMALY_RESULT_INDEX_ALIAS
                 ),
                 true
             );
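RolloverTests checks that the rollover request carries exactly one max-docs condition, scaled by the number of eligible data nodes. A compact sketch of that assertion, not part of the patch, using only classes and constants visible above:

    import static org.junit.Assert.assertEquals;

    import java.util.Map;

    import org.opensearch.action.admin.indices.rollover.Condition;
    import org.opensearch.action.admin.indices.rollover.MaxDocsCondition;
    import org.opensearch.action.admin.indices.rollover.RolloverRequest;

    // The effective doc limit is the per-node default multiplied by the
    // number of eligible data nodes reported by DiscoveryNodeFilterer.
    void assertMaxDocsCondition(RolloverRequest request, long defaultMaxDocs, int numberOfNodes) {
        Map<String, Condition<?>> conditions = request.getConditions();
        assertEquals(1, conditions.size());
        assertEquals(new MaxDocsCondition(defaultMaxDocs * numberOfNodes), conditions.get(MaxDocsCondition.NAME));
    }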
diff --git a/src/test/java/org/opensearch/ad/indices/UpdateMappingTests.java b/src/test/java/org/opensearch/ad/indices/UpdateMappingTests.java
index 42c4870e0..10b00426a 100644
--- a/src/test/java/org/opensearch/ad/indices/UpdateMappingTests.java
+++ b/src/test/java/org/opensearch/ad/indices/UpdateMappingTests.java
@@ -39,10 +39,7 @@
 import org.opensearch.action.admin.indices.settings.get.GetSettingsResponse;
 import org.opensearch.action.admin.indices.settings.put.UpdateSettingsRequest;
 import org.opensearch.action.support.master.AcknowledgedResponse;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
 import org.opensearch.client.AdminClient;
 import org.opensearch.client.Client;
 import org.opensearch.client.IndicesAdminClient;
@@ -57,11 +54,15 @@
 import org.opensearch.common.settings.Settings;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.index.IndexNotFoundException;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;

-public class UpdateMappingTests extends AbstractADTest {
+public class UpdateMappingTests extends AbstractTimeSeriesTest {
     private static String resultIndexName;

-    private AnomalyDetectionIndices adIndices;
+    private ADIndexManagement adIndices;
     private ClusterService clusterService;
     private int numberOfNodes;
     private AdminClient adminClient;
@@ -98,7 +99,7 @@ public void setUp() throws Exception {
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD,
                         AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD,
-                        AnomalyDetectorSettings.MAX_PRIMARY_SHARDS
+                        AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS
                     )
                 )
             )
@@ -122,25 +123,24 @@ public void setUp() throws Exception {
         nodeFilter = mock(DiscoveryNodeFilterer.class);
         numberOfNodes = 2;
         when(nodeFilter.getNumberOfEligibleDataNodes()).thenReturn(numberOfNodes);
-        adIndices = new AnomalyDetectionIndices(
+        adIndices = new ADIndexManagement(
             client,
             clusterService,
             threadPool,
             settings,
             nodeFilter,
-            AnomalyDetectorSettings.MAX_UPDATE_RETRY_TIMES
+            TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES
         );
     }

     public void testNoIndexToUpdate() {
         adIndices.update();
         verify(indicesAdminClient, never()).putMapping(any(), any());
-        // for an index, we may check doesAliasExists/doesIndexExists and shouldUpdateConcreteIndex
-        // 1 time for result index since alias does not exist and 2 times for other indices
-        verify(clusterService, times(9)).state();
+        // for an index, we may check doesAliasExists/doesIndexExists
+        verify(clusterService, times(5)).state();
         adIndices.update();
         // we will not trigger new check since we have checked all indices before
-        verify(clusterService, times(9)).state();
+        verify(clusterService, times(5)).state();
     }

     @SuppressWarnings({ "serial", "unchecked" })

@@ -165,7 +165,7 @@ public void testUpdateMapping() throws IOException {
             .numberOfReplicas(0)
             .putMapping(new MappingMetadata("type", new HashMap<String, Object>() {
                 {
-                    put(AnomalyDetectionIndices.META, new HashMap<String, Object>() {
+                    put(ADIndexManagement.META, new HashMap<String, Object>() {
                         {
                             // version 1 will cause update
                             put(CommonName.SCHEMA_VERSION_FIELD, 1);

@@ -298,7 +298,7 @@ public void testFailtoUpdateJobSetting() {
     }

     @SuppressWarnings("unchecked")
-    public void testTooManyUpdate() {
+    public void testTooManyUpdate() throws IOException {
         setUpSuccessfulGetJobSetting();
         doAnswer(invocation -> {
             ActionListener<AcknowledgedResponse> listener = (ActionListener<AcknowledgedResponse>) invocation.getArgument(2);
@@ -307,7 +307,7 @@ public void testTooManyUpdate() {
             return null;
         }).when(indicesAdminClient).updateSettings(any(), any());

-        adIndices = new AnomalyDetectionIndices(client, clusterService, threadPool, settings, nodeFilter, 1);
+        adIndices = new ADIndexManagement(client, clusterService, threadPool, settings, nodeFilter, 1);

         adIndices.update();
         adIndices.update();
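testUpdateMapping forces an update by seeding the index _meta block with schema version 1. A small sketch of how such a _meta block is shaped, not part of the patch, using only the META and SCHEMA_VERSION_FIELD constants that appear above:

    import java.util.HashMap;
    import java.util.Map;

    import org.opensearch.ad.indices.ADIndexManagement;
    import org.opensearch.timeseries.constant.CommonName;

    // A _meta block carrying the mapping's schema version; a stale version
    // such as 1 is what triggers a putMapping call during update().
    Map<String, Object> meta = new HashMap<>();
    meta.put(CommonName.SCHEMA_VERSION_FIELD, 1);
    Map<String, Object> mappingSource = new HashMap<>();
    mappingSource.put(ADIndexManagement.META, meta);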
diff --git a/src/test/java/org/opensearch/ad/ml/AbstractCosineDataTest.java b/src/test/java/org/opensearch/ad/ml/AbstractCosineDataTest.java
index 386ce27bf..1a86e45d4 100644
--- a/src/test/java/org/opensearch/ad/ml/AbstractCosineDataTest.java
+++ b/src/test/java/org/opensearch/ad/ml/AbstractCosineDataTest.java
@@ -15,9 +15,6 @@
 import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE;

 import java.time.Clock;
 import java.time.Instant;
@@ -32,23 +29,10 @@
 import org.opensearch.Version;
 import org.opensearch.action.get.GetRequest;
 import org.opensearch.action.get.GetResponse;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.MemoryTracker;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.dataprocessor.IntegerSensitiveSingleFeatureLinearUniformInterpolator;
-import org.opensearch.ad.dataprocessor.Interpolator;
-import org.opensearch.ad.dataprocessor.LinearUniformInterpolator;
-import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator;
 import org.opensearch.ad.feature.FeatureManager;
-import org.opensearch.ad.feature.SearchFeatureDao;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
 import org.opensearch.ad.ratelimit.CheckpointWriteWorker;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.ClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.node.DiscoveryNodeRole;
@@ -60,10 +44,23 @@
 import org.opensearch.test.ClusterServiceUtils;
 import org.opensearch.test.OpenSearchTestCase;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.MemoryTracker;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.dataprocessor.Imputer;
+import org.opensearch.timeseries.dataprocessor.LinearUniformImputer;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.ClientUtil;

 import com.google.common.collect.ImmutableList;

-public class AbstractCosineDataTest extends AbstractADTest {
+public class AbstractCosineDataTest extends AbstractTimeSeriesTest {
     int numMinSamples;
     String modelId;
     String entityName;
@@ -74,7 +71,7 @@ public class AbstractCosineDataTest extends AbstractADTest {
     EntityColdStarter entityColdStarter;
     NodeStateManager stateManager;
     SearchFeatureDao searchFeatureDao;
-    Interpolator interpolator;
+    Imputer imputer;
     CheckpointDao checkpoint;
     FeatureManager featureManager;
     Settings settings;
@@ -98,7 +95,7 @@ public class AbstractCosineDataTest extends AbstractADTest {
     @Override
     public void setUp() throws Exception {
         super.setUp();
-        numMinSamples = AnomalyDetectorSettings.NUM_MIN_SAMPLES;
+        numMinSamples = TimeSeriesSettings.NUM_MIN_SAMPLES;

         clock = mock(Clock.class);
         when(clock.instant()).thenReturn(Instant.now());
@@ -120,15 +117,15 @@ public void setUp() throws Exception {

         doAnswer(invocation -> {
             ActionListener<GetResponse> listener = invocation.getArgument(2);
-            listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, AnomalyDetector.ANOMALY_DETECTORS_INDEX));
+            listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, CommonName.CONFIG_INDEX));
             return null;
         }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class));

         nodestateSetting = new HashSet<>(ClusterSettings.BUILT_IN_CLUSTER_SETTINGS);
-        nodestateSetting.add(MAX_RETRY_FOR_UNRESPONSIVE_NODE);
-        nodestateSetting.add(BACKOFF_MINUTES);
-        nodestateSetting.add(CHECKPOINT_SAVING_FREQ);
+        nodestateSetting.add(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE);
+        nodestateSetting.add(TimeSeriesSettings.BACKOFF_MINUTES);
+        nodestateSetting.add(AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ);
         clusterSettings = new ClusterSettings(Settings.EMPTY, nodestateSetting);

         discoveryNode = new DiscoveryNode(
@@ -147,20 +144,20 @@ public void setUp() throws Exception {
             settings,
             clientUtil,
             clock,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
-            clusterService
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
+            clusterService,
+            TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE,
+            TimeSeriesSettings.BACKOFF_MINUTES
         );

-        SingleFeatureLinearUniformInterpolator singleFeatureLinearUniformInterpolator =
-            new IntegerSensitiveSingleFeatureLinearUniformInterpolator();
-        interpolator = new LinearUniformInterpolator(singleFeatureLinearUniformInterpolator);
+        imputer = new LinearUniformImputer(true);

         searchFeatureDao = mock(SearchFeatureDao.class);
         checkpoint = mock(CheckpointDao.class);
         featureManager = new FeatureManager(
             searchFeatureDao,
-            interpolator,
+            imputer,
             clock,
             AnomalyDetectorSettings.MAX_TRAIN_SAMPLE,
             AnomalyDetectorSettings.MAX_SAMPLE_STRIDE,
@@ -170,9 +167,9 @@ public void setUp() throws Exception {
             AnomalyDetectorSettings.MAX_IMPUTATION_NEIGHBOR_DISTANCE,
             AnomalyDetectorSettings.PREVIEW_SAMPLE_RATE,
             AnomalyDetectorSettings.MAX_PREVIEW_SAMPLES,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
             threadPool,
-            AnomalyDetectorPlugin.AD_THREAD_POOL_NAME
+            TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME
         );

         checkpointWriteQueue = mock(CheckpointWriteWorker.class);
@@ -182,21 +179,21 @@ public void setUp() throws Exception {
             clock,
             threadPool,
             stateManager,
-            AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE,
-            AnomalyDetectorSettings.NUM_TREES,
-            AnomalyDetectorSettings.TIME_DECAY,
+            TimeSeriesSettings.NUM_SAMPLES_PER_TREE,
+            TimeSeriesSettings.NUM_TREES,
+            TimeSeriesSettings.TIME_DECAY,
             numMinSamples,
             AnomalyDetectorSettings.MAX_SAMPLE_STRIDE,
             AnomalyDetectorSettings.MAX_TRAIN_SAMPLE,
-            interpolator,
+            imputer,
             searchFeatureDao,
-            AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE,
+            TimeSeriesSettings.THRESHOLD_MIN_PVALUE,
             featureManager,
             settings,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
             checkpointWriteQueue,
             rcfSeed,
-            AnomalyDetectorSettings.MAX_COLD_START_ROUNDS
+            TimeSeriesSettings.MAX_COLD_START_ROUNDS
         );

         detectorId = "123";
@@ -217,14 +214,14 @@ public void setUp() throws Exception {
         modelManager = new ModelManager(
             mock(CheckpointDao.class),
             mock(Clock.class),
-            AnomalyDetectorSettings.NUM_TREES,
-            AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE,
-            AnomalyDetectorSettings.TIME_DECAY,
-            AnomalyDetectorSettings.NUM_MIN_SAMPLES,
-            AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE,
+            TimeSeriesSettings.NUM_TREES,
+            TimeSeriesSettings.NUM_SAMPLES_PER_TREE,
+            TimeSeriesSettings.TIME_DECAY,
+            TimeSeriesSettings.NUM_MIN_SAMPLES,
+            TimeSeriesSettings.THRESHOLD_MIN_PVALUE,
             AnomalyDetectorSettings.MIN_PREVIEW_SIZE,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
-            AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ,
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
+            AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ,
             entityColdStarter,
             mock(FeatureManager.class),
             mock(MemoryTracker.class),
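The Interpolator hierarchy used by the old cold-start path collapses into a single Imputer above. A minimal usage sketch, not part of the patch, based on the constructor flag and impute calls that appear in this file (the sample numbers are invented; the boolean is assumed to mirror the integer-sensitive behavior of the removed interpolator):

    import org.opensearch.timeseries.dataprocessor.Imputer;
    import org.opensearch.timeseries.dataprocessor.LinearUniformImputer;

    // One feature row, two anchor samples, 61 points total (59 imputed between).
    Imputer imputer = new LinearUniformImputer(true);
    double[][] imputed = imputer.impute(new double[][] { new double[] { 57.0, 1.0 } }, 61);
    assert imputed[0].length == 61;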
diff --git a/src/test/java/org/opensearch/ad/ml/CheckpointDaoTests.java b/src/test/java/org/opensearch/ad/ml/CheckpointDaoTests.java
index ab6716b6f..72358af10 100644
--- a/src/test/java/org/opensearch/ad/ml/CheckpointDaoTests.java
+++ b/src/test/java/org/opensearch/ad/ml/CheckpointDaoTests.java
@@ -23,9 +23,7 @@
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
 import static org.opensearch.action.DocWriteResponse.Result.UPDATED;
-import static org.opensearch.ad.ml.CheckpointDao.FIELD_MODEL;
 import static org.opensearch.ad.ml.CheckpointDao.FIELD_MODELV2;
-import static org.opensearch.ad.ml.CheckpointDao.TIMESTAMP;

 import java.io.BufferedReader;
 import java.io.File;
@@ -95,16 +93,17 @@
 import org.opensearch.action.support.replication.ReplicationResponse;
 import org.opensearch.action.update.UpdateRequest;
 import org.opensearch.action.update.UpdateResponse;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
-import org.opensearch.ad.settings.AnomalyDetectorSettings;
-import org.opensearch.ad.util.ClientUtil;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.client.Client;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.index.shard.ShardId;
 import org.opensearch.index.IndexNotFoundException;
 import org.opensearch.index.engine.VersionConflictEngineException;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;
+import org.opensearch.timeseries.util.ClientUtil;

 import com.amazon.randomcutforest.RandomCutForest;
 import com.amazon.randomcutforest.config.Precision;
@@ -144,7 +143,7 @@ public class CheckpointDaoTests extends OpenSearchTestCase {
     private Clock clock;

     @Mock
-    private AnomalyDetectionIndices indexUtil;
+    private ADIndexManagement indexUtil;

     private Schema<ThresholdedRandomCutForestState> trcfSchema;
@@ -195,7 +194,7 @@ public GenericObjectPool<LinkedBuffer> run() {
                 return new GenericObjectPool<>(new BasePooledObjectFactory<LinkedBuffer>() {
                     @Override
                     public LinkedBuffer create() throws Exception {
-                        return LinkedBuffer.allocate(AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES);
+                        return LinkedBuffer.allocate(TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES);
                     }

                     @Override
@@ -205,11 +204,11 @@ public PooledObject<LinkedBuffer> wrap(LinkedBuffer obj) {
                 });
             }
         }));
-        serializeRCFBufferPool.setMaxTotal(AnomalyDetectorSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS);
-        serializeRCFBufferPool.setMaxIdle(AnomalyDetectorSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS);
+        serializeRCFBufferPool.setMaxTotal(TimeSeriesSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS);
+        serializeRCFBufferPool.setMaxIdle(TimeSeriesSettings.MAX_TOTAL_RCF_SERIALIZATION_BUFFERS);
         serializeRCFBufferPool.setMinIdle(0);
         serializeRCFBufferPool.setBlockWhenExhausted(false);
-        serializeRCFBufferPool.setTimeBetweenEvictionRuns(AnomalyDetectorSettings.HOURLY_MAINTENANCE);
+        serializeRCFBufferPool.setTimeBetweenEvictionRuns(TimeSeriesSettings.HOURLY_MAINTENANCE);

         anomalyRate = 0.005;
         checkpointDao = new CheckpointDao(
@@ -225,7 +224,7 @@ public PooledObject<LinkedBuffer> wrap(LinkedBuffer obj) {
             indexUtil,
             maxCheckpointBytes,
             serializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
@@ -285,10 +284,10 @@ private void verifyPutModelCheckpointAsync() {
         assertEquals(indexName, updateRequest.index());
         assertEquals(modelId, updateRequest.id());
         IndexRequest indexRequest = updateRequest.doc();
-        Set<String> expectedSourceKeys = new HashSet<String>(Arrays.asList(FIELD_MODELV2, CheckpointDao.TIMESTAMP));
+        Set<String> expectedSourceKeys = new HashSet<String>(Arrays.asList(FIELD_MODELV2, CommonName.TIMESTAMP));
         assertEquals(expectedSourceKeys, indexRequest.sourceAsMap().keySet());
         assertTrue(!((String) (indexRequest.sourceAsMap().get(FIELD_MODELV2))).isEmpty());
-        assertNotNull(indexRequest.sourceAsMap().get(CheckpointDao.TIMESTAMP));
+        assertNotNull(indexRequest.sourceAsMap().get(CommonName.TIMESTAMP));

         ArgumentCaptor<Void> responseCaptor = ArgumentCaptor.forClass(Void.class);
         verify(listener).onResponse(responseCaptor.capture());
@@ -305,7 +304,7 @@ public void test_putModelCheckpoint_callListener_no_checkpoint_index() {

         doAnswer(invocation -> {
             ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
-            listener.onResponse(new CreateIndexResponse(true, true, CommonName.CHECKPOINT_INDEX_NAME));
+            listener.onResponse(new CreateIndexResponse(true, true, ADCommonName.CHECKPOINT_INDEX_NAME));
             return null;
         }).when(indexUtil).initCheckpointIndex(any());
@@ -317,7 +316,7 @@ public void test_putModelCheckpoint_callListener_race_condition() {

         doAnswer(invocation -> {
             ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
-            listener.onFailure(new ResourceAlreadyExistsException(CommonName.CHECKPOINT_INDEX_NAME));
+            listener.onFailure(new ResourceAlreadyExistsException(ADCommonName.CHECKPOINT_INDEX_NAME));
             return null;
         }).when(indexUtil).initCheckpointIndex(any());
@@ -345,7 +344,7 @@ public void test_getModelCheckpoint_returnExpectedToListener() {
         // ArgumentCaptor<GetRequest> requestCaptor = ArgumentCaptor.forClass(GetRequest.class);
         UpdateResponse updateResponse = new UpdateResponse(
             new ReplicationResponse.ShardInfo(3, 2),
-            new ShardId(CommonName.CHECKPOINT_INDEX_NAME, "uuid", 2),
+            new ShardId(ADCommonName.CHECKPOINT_INDEX_NAME, "uuid", 2),
             "1",
             7,
             17,
@@ -399,7 +398,7 @@ public void test_getModelCheckpoint_Bwc() {
         // ArgumentCaptor<GetRequest> requestCaptor = ArgumentCaptor.forClass(GetRequest.class);
         UpdateResponse updateResponse = new UpdateResponse(
             new ReplicationResponse.ShardInfo(3, 2),
-            new ShardId(CommonName.CHECKPOINT_INDEX_NAME, "uuid", 2),
+            new ShardId(ADCommonName.CHECKPOINT_INDEX_NAME, "uuid", 2),
             "1",
             7,
             17,
@@ -503,9 +502,9 @@ public void test_restore() throws IOException {
         GetResponse getResponse = mock(GetResponse.class);
         when(getResponse.isExists()).thenReturn(true);
         Map<String, Object> source = new HashMap<>();
-        source.put(CheckpointDao.DETECTOR_ID, state.getDetectorId());
+        source.put(CheckpointDao.DETECTOR_ID, state.getId());
         source.put(CheckpointDao.FIELD_MODELV2, checkpointDao.toCheckpoint(modelToSave, modelId).get());
-        source.put(CheckpointDao.TIMESTAMP, "2020-10-11T22:58:23.610392Z");
+        source.put(CommonName.TIMESTAMP, "2020-10-11T22:58:23.610392Z");
         when(getResponse.getSource()).thenReturn(source);

         doAnswer(invocation -> {
@@ -548,7 +547,7 @@ public void test_batch_write_no_index() {

         doAnswer(invocation -> {
             ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
-            listener.onResponse(new CreateIndexResponse(true, true, CommonName.CHECKPOINT_INDEX_NAME));
+            listener.onResponse(new CreateIndexResponse(true, true, ADCommonName.CHECKPOINT_INDEX_NAME));
             return null;
         }).when(indexUtil).initCheckpointIndex(any());
         checkpointDao.batchWrite(new BulkRequest(), null);
@@ -560,7 +559,7 @@ public void test_batch_write_index_init_no_ack() throws InterruptedException {

         doAnswer(invocation -> {
             ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
-            listener.onResponse(new CreateIndexResponse(false, false, CommonName.CHECKPOINT_INDEX_NAME));
+            listener.onResponse(new CreateIndexResponse(false, false, ADCommonName.CHECKPOINT_INDEX_NAME));
             return null;
         }).when(indexUtil).initCheckpointIndex(any());

@@ -607,14 +606,14 @@ public void test_batch_write_init_exception() throws InterruptedException {
     private BulkResponse createBulkResponse(int succeeded, int failed, String[] failedId) {
         BulkItemResponse[] bulkItemResponses = new BulkItemResponse[succeeded + failed];
-        ShardId shardId = new ShardId(CommonName.CHECKPOINT_INDEX_NAME, "", 1);
+        ShardId shardId = new ShardId(ADCommonName.CHECKPOINT_INDEX_NAME, "", 1);
         int i = 0;
         for (; i < failed; i++) {
             bulkItemResponses[i] = new BulkItemResponse(
                 i,
                 DocWriteRequest.OpType.UPDATE,
                 new BulkItemResponse.Failure(
-                    CommonName.CHECKPOINT_INDEX_NAME,
+                    ADCommonName.CHECKPOINT_INDEX_NAME,
                     failedId[i],
                     new VersionConflictEngineException(shardId, "id", "test")
                 )
@@ -661,9 +660,9 @@ public void test_batch_read() throws InterruptedException {
             items[0] = new MultiGetItemResponse(
                 null,
                 new MultiGetResponse.Failure(
-                    CommonName.CHECKPOINT_INDEX_NAME,
+                    ADCommonName.CHECKPOINT_INDEX_NAME,
                     "modelId",
-                    new IndexNotFoundException(CommonName.CHECKPOINT_INDEX_NAME)
+                    new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME)
                 )
             );
             listener.onResponse(new MultiGetResponse(items));
@@ -693,7 +692,7 @@ public void test_too_large_checkpoint() throws IOException {
             indexUtil,
             1, // make the max checkpoint size 1 byte only
             serializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
@@ -730,7 +729,7 @@ public void testBorrowFromPoolFailure() throws Exception {
             indexUtil,
             1, // make the max checkpoint size 1 byte only
             mockSerializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
@@ -755,7 +754,7 @@ public void testMapperFailure() throws IOException {
             indexUtil,
             1, // make the max checkpoint size 1 byte only
             serializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
@@ -763,7 +762,7 @@ public void testMapperFailure() throws IOException {
         ModelState<EntityModel> state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).sampleSize(1).build());
         String json = checkpointDao.toCheckpoint(state.getModel(), modelId).get();
         assertEquals(null, JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_TRCF));
-        assertTrue(null != JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_SAMPLE));
+        assertTrue(null != JsonDeserializer.getChildNode(json, CommonName.ENTITY_SAMPLE));
         // assertTrue(null != JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_THRESHOLD));
         // assertNotNull(JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_TRCF));
     }
@@ -772,7 +771,7 @@ public void testEmptySample() throws IOException {
         ModelState<EntityModel> state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).sampleSize(0).build());
         String json = checkpointDao.toCheckpoint(state.getModel(), modelId).get();
         // assertTrue(null != JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_TRCF));
-        assertEquals(null, JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_SAMPLE));
+        assertEquals(null, JsonDeserializer.getChildNode(json, CommonName.ENTITY_SAMPLE));
         // assertTrue(null != JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_THRESHOLD));
         assertNotNull(JsonDeserializer.getChildNode(json, CheckpointDao.ENTITY_TRCF));
     }
@@ -803,7 +802,7 @@ private void setUpMockTrcf() {
             indexUtil,
             maxCheckpointBytes,
             serializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
     }
@@ -846,7 +845,7 @@ public void testFromEntityModelCheckpointWithTrcf() throws Exception {
         Map<String, Object> entity = new HashMap<>();
         entity.put(FIELD_MODELV2, model);
-        entity.put(TIMESTAMP, Instant.now().toString());
+        entity.put(CommonName.TIMESTAMP, Instant.now().toString());
         Optional<Entry<EntityModel, Instant>> result = checkpointDao.fromEntityModelCheckpoint(entity, this.modelId);

         assertTrue(result.isPresent());
@@ -863,7 +862,7 @@ public void testFromEntityModelCheckpointTrcfMapperFail() throws Exception {
         Map<String, Object> entity = new HashMap<>();
         entity.put(FIELD_MODELV2, model);
-        entity.put(TIMESTAMP, Instant.now().toString());
+        entity.put(CommonName.TIMESTAMP, Instant.now().toString());
         Optional<Entry<EntityModel, Instant>> result = checkpointDao.fromEntityModelCheckpoint(entity, this.modelId);

         assertTrue(result.isPresent());
@@ -888,8 +887,8 @@ private Pair<Map<String, Object>, Instant> setUp1_0Model(String checkpointFileNa
         Instant now = Instant.now();

         Map<String, Object> entity = new HashMap<>();
-        entity.put(FIELD_MODEL, model);
-        entity.put(TIMESTAMP, now.toString());
+        entity.put(CommonName.FIELD_MODEL, model);
+        entity.put(CommonName.TIMESTAMP, now.toString());
         return Pair.of(entity, now);
     }
@@ -940,7 +939,7 @@ public void testFromEntityModelCheckpointModelTooLarge() throws FileNotFoundExce
             indexUtil,
             100_000, // checkpoint_2.json is of 224603 bytes.
             serializeRCFBufferPool,
-            AnomalyDetectorSettings.SERIALIZATION_BUFFER_BYTES,
+            TimeSeriesSettings.SERIALIZATION_BUFFER_BYTES,
             anomalyRate
         );
         Optional<Entry<EntityModel, Instant>> result = checkpointDao.fromEntityModelCheckpoint(modelPair.getLeft(), this.modelId);
@@ -951,7 +950,7 @@ public void testFromEntityModelCheckpointModelTooLarge() throws FileNotFoundExce
     // test no model is present in checkpoint
     public void testFromEntityModelCheckpointEmptyModel() throws FileNotFoundException, IOException, URISyntaxException {
         Map<String, Object> entity = new HashMap<>();
-        entity.put(TIMESTAMP, Instant.now().toString());
+        entity.put(CommonName.TIMESTAMP, Instant.now().toString());

         Optional<Entry<EntityModel, Instant>> result = checkpointDao.fromEntityModelCheckpoint(entity, this.modelId);
         assertTrue(!result.isPresent());
@@ -989,7 +988,7 @@ public void testFromEntityModelCheckpointWithEntity() throws Exception {
             .randomModelState(new RandomModelStateConfig.Builder().fullModel(true).entityAttributes(true).build());
         Map<String, Object> content = checkpointDao.toIndexSource(state);
         // Opensearch will convert from java.time.ZonedDateTime to String. Here I am converting to simulate that
-        content.put(TIMESTAMP, "2021-09-23T05:00:37.93195Z");
+        content.put(CommonName.TIMESTAMP, "2021-09-23T05:00:37.93195Z");

         Optional<Entry<EntityModel, Instant>> result = checkpointDao.fromEntityModelCheckpoint(content, this.modelId);
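CheckpointDaoTests now keys checkpoint documents by shared constants rather than CheckpointDao's own field names. A sketch of the minimal source map a checkpoint is restored from, not part of the patch, using only field constants that appear above (the model payload is a placeholder):

    import java.time.Instant;
    import java.util.HashMap;
    import java.util.Map;

    import org.opensearch.ad.ml.CheckpointDao;
    import org.opensearch.timeseries.constant.CommonName;

    // FIELD_MODELV2 holds the serialized model; the timestamp key moved to CommonName.
    Map<String, Object> source = new HashMap<>();
    source.put(CheckpointDao.FIELD_MODELV2, "<serialized model>"); // placeholder payload
    source.put(CommonName.TIMESTAMP, Instant.now().toString());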
diff --git a/src/test/java/org/opensearch/ad/ml/CheckpointDeleteTests.java b/src/test/java/org/opensearch/ad/ml/CheckpointDeleteTests.java
index b154588e5..dcda1ff92 100644
--- a/src/test/java/org/opensearch/ad/ml/CheckpointDeleteTests.java
+++ b/src/test/java/org/opensearch/ad/ml/CheckpointDeleteTests.java
@@ -26,16 +26,16 @@
 import org.junit.Before;
 import org.mockito.Mock;
 import org.opensearch.OpenSearchException;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
-import org.opensearch.ad.util.ClientUtil;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.client.Client;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.index.IndexNotFoundException;
 import org.opensearch.index.reindex.BulkByScrollResponse;
 import org.opensearch.index.reindex.DeleteByQueryAction;
 import org.opensearch.index.reindex.ScrollableHitSource;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.ClientUtil;

 import com.amazon.randomcutforest.parkservices.state.ThresholdedRandomCutForestMapper;
 import com.amazon.randomcutforest.parkservices.state.ThresholdedRandomCutForestState;
@@ -52,7 +52,7 @@
  * class for tests requiring checking logs.
  *
  */
-public class CheckpointDeleteTests extends AbstractADTest {
+public class CheckpointDeleteTests extends AbstractTimeSeriesTest {
     private enum DeleteExecutionMode {
         NORMAL,
         INDEX_NOT_FOUND,
@@ -64,7 +64,7 @@ private enum DeleteExecutionMode {
     private Client client;
     private ClientUtil clientUtil;
     private Gson gson;
-    private AnomalyDetectionIndices indexUtil;
+    private ADIndexManagement indexUtil;
     private String detectorId;
     private int maxCheckpointBytes;
    private GenericObjectPool<LinkedBuffer> objectPool;
@@ -87,7 +87,7 @@ public void setUp() throws Exception {
         client = mock(Client.class);
         clientUtil = mock(ClientUtil.class);
         gson = null;
-        indexUtil = mock(AnomalyDetectionIndices.class);
+        indexUtil = mock(ADIndexManagement.class);
         detectorId = "123";
         maxCheckpointBytes = 1_000_000;
@@ -100,7 +100,7 @@ public void setUp() throws Exception {
         checkpointDao = new CheckpointDao(
             client,
             clientUtil,
-            CommonName.CHECKPOINT_INDEX_NAME,
+            ADCommonName.CHECKPOINT_INDEX_NAME,
             gson,
             mapper,
             converter,
@@ -140,7 +140,7 @@ public void delete_by_detector_id_template(DeleteExecutionMode mode) {
             assertTrue(listener != null);

             if (mode == DeleteExecutionMode.INDEX_NOT_FOUND) {
-                listener.onFailure(new IndexNotFoundException(CommonName.CHECKPOINT_INDEX_NAME));
+                listener.onFailure(new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME));
             } else if (mode == DeleteExecutionMode.FAILURE) {
                 listener.onFailure(new OpenSearchException(""));
             } else {
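delete_by_detector_id_template drives the captured delete-by-query listener through each execution mode. A condensed sketch of that dispatch, not part of the patch, assuming the test's private DeleteExecutionMode enum shown above and a captured listener and response:

    import org.opensearch.OpenSearchException;
    import org.opensearch.ad.constant.ADCommonName;
    import org.opensearch.core.action.ActionListener;
    import org.opensearch.index.IndexNotFoundException;
    import org.opensearch.index.reindex.BulkByScrollResponse;

    // Complete the listener according to the mode under test.
    void complete(ActionListener<BulkByScrollResponse> listener, DeleteExecutionMode mode, BulkByScrollResponse response) {
        if (mode == DeleteExecutionMode.INDEX_NOT_FOUND) {
            listener.onFailure(new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME));
        } else if (mode == DeleteExecutionMode.FAILURE) {
            listener.onFailure(new OpenSearchException(""));
        } else {
            listener.onResponse(response);
        }
    }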
com.amazon.randomcutforest.config.TransformMethod; @@ -75,28 +78,28 @@ public class EntityColdStarterTests extends AbstractCosineDataTest { public static void initOnce() { ClusterService clusterService = mock(ClusterService.class); - Set> settingSet = EnabledSetting.settings.values().stream().collect(Collectors.toSet()); + Set> settingSet = ADEnabledSetting.settings.values().stream().collect(Collectors.toSet()); when(clusterService.getClusterSettings()).thenReturn(new ClusterSettings(Settings.EMPTY, settingSet)); - EnabledSetting.getInstance().init(clusterService); + ADEnabledSetting.getInstance().init(clusterService); } @AfterClass public static void clearOnce() { // restore to default value - EnabledSetting.getInstance().setSettingValue(EnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, false); + ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, false); } @Override public void setUp() throws Exception { super.setUp(); - EnabledSetting.getInstance().setSettingValue(EnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, Boolean.TRUE); + ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, Boolean.TRUE); } @Override public void tearDown() throws Exception { - EnabledSetting.getInstance().setSettingValue(EnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, Boolean.FALSE); + ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, Boolean.FALSE); super.tearDown(); } @@ -119,10 +122,10 @@ public void testColdStart() throws InterruptedException, IOException { modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); @@ -134,10 +137,10 @@ public void testColdStart() throws InterruptedException, IOException { coldStartSamples.add(Optional.of(sample2)); coldStartSamples.add(Optional.of(sample3)); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -167,9 +170,9 @@ public void testColdStart() throws InterruptedException, IOException { // for function interpolate: // 1st parameter is a matrix of size numFeatures * numSamples // 2nd parameter is the number of interpolants including two samples - double[][] interval1 = interpolator.interpolate(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); + double[][] interval1 = imputer.impute(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval1, 60)); - double[][] interval2 = interpolator.interpolate(new double[][] { new double[] { sample2[0], sample3[0] } }, 61); + double[][] interval2 = imputer.impute(new double[][] { 
new double[] { sample2[0], sample3[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval2, 61)); assertEquals(121, expectedColdStartData.size()); @@ -183,14 +186,14 @@ public void testMissMin() throws IOException, InterruptedException { modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.empty()); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); - verify(searchFeatureDao, never()).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + verify(searchFeatureDao, never()).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); assertTrue(!model.getTrcf().isPresent()); checkSemaphoreRelease(); @@ -210,16 +213,16 @@ private void diffTesting(ModelState modelState, List cold .dimensions(inputDimension * detector.getShingleSize()) .precision(Precision.FLOAT_32) .randomSeed(rcfSeed) - .numberOfTrees(AnomalyDetectorSettings.NUM_TREES) + .numberOfTrees(TimeSeriesSettings.NUM_TREES) .shingleSize(detector.getShingleSize()) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) - .timeDecay(AnomalyDetectorSettings.TIME_DECAY) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .timeDecay(TimeSeriesSettings.TIME_DECAY) .outputAfter(numMinSamples) .initialAcceptFraction(0.125d) .parallelExecutionEnabled(false) - .sampleSize(AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE) + .sampleSize(TimeSeriesSettings.NUM_SAMPLES_PER_TREE) .internalShinglingEnabled(true) - .anomalyRate(1 - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE) + .anomalyRate(1 - TimeSeriesSettings.THRESHOLD_MIN_PVALUE) .transformMethod(TransformMethod.NORMALIZE) .alertOnce(true) .autoAdjust(true) @@ -271,10 +274,10 @@ public void testTwoSegmentsWithSingleSample() throws InterruptedException, IOExc modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); double[] sample1 = new double[] { 57.0 }; @@ -287,10 +290,10 @@ public void testTwoSegmentsWithSingleSample() throws InterruptedException, IOExc coldStartSamples.add(Optional.empty()); coldStartSamples.add(Optional.of(sample5)); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -306,11 +309,11 @@ public void testTwoSegmentsWithSingleSample() throws 
InterruptedException, IOExc // for function interpolate: // 1st parameter is a matrix of size numFeatures * numSamples // 2nd parameter is the number of interpolants including two samples - double[][] interval1 = interpolator.interpolate(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); + double[][] interval1 = imputer.impute(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval1, 60)); - double[][] interval2 = interpolator.interpolate(new double[][] { new double[] { sample2[0], sample3[0] } }, 61); + double[][] interval2 = imputer.impute(new double[][] { new double[] { sample2[0], sample3[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval2, 60)); - double[][] interval3 = interpolator.interpolate(new double[][] { new double[] { sample3[0], sample5[0] } }, 121); + double[][] interval3 = imputer.impute(new double[][] { new double[] { sample3[0], sample5[0] } }, 121); expectedColdStartData.addAll(convertToFeatures(interval3, 121)); assertTrue("size: " + model.getSamples().size(), model.getSamples().isEmpty()); assertEquals(241, expectedColdStartData.size()); @@ -325,10 +328,10 @@ public void testTwoSegments() throws InterruptedException, IOException { modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); double[] sample1 = new double[] { 57.0 }; @@ -343,10 +346,10 @@ public void testTwoSegments() throws InterruptedException, IOException { coldStartSamples.add(Optional.of(new double[] { -17.0 })); coldStartSamples.add(Optional.of(new double[] { -38.0 })); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -362,13 +365,13 @@ public void testTwoSegments() throws InterruptedException, IOException { // for function interpolate: // 1st parameter is a matrix of size numFeatures * numSamples // 2nd parameter is the number of interpolants including two samples - double[][] interval1 = interpolator.interpolate(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); + double[][] interval1 = imputer.impute(new double[][] { new double[] { sample1[0], sample2[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval1, 60)); - double[][] interval2 = interpolator.interpolate(new double[][] { new double[] { sample2[0], sample3[0] } }, 61); + double[][] interval2 = imputer.impute(new double[][] { new double[] { sample2[0], sample3[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval2, 60)); - double[][] interval3 = interpolator.interpolate(new double[][] { new double[] { sample3[0], sample5[0] } }, 121); + double[][] interval3 = imputer.impute(new double[][] { new double[] { 
sample3[0], sample5[0] } }, 121); expectedColdStartData.addAll(convertToFeatures(interval3, 120)); - double[][] interval4 = interpolator.interpolate(new double[][] { new double[] { sample5[0], sample6[0] } }, 61); + double[][] interval4 = imputer.impute(new double[][] { new double[] { sample5[0], sample6[0] } }, 61); expectedColdStartData.addAll(convertToFeatures(interval4, 61)); assertEquals(301, expectedColdStartData.size()); assertTrue("size: " + model.getSamples().size(), model.getSamples().isEmpty()); @@ -381,17 +384,17 @@ public void testThrottledColdStart() throws InterruptedException { modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onFailure(new OpenSearchRejectedExecutionException("")); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); entityColdStarter.trainModel(entity, "456", modelState, listener); // only the first one makes the call - verify(searchFeatureDao, times(1)).getEntityMinDataTime(any(), any(), any()); + verify(searchFeatureDao, times(1)).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); checkSemaphoreRelease(); } @@ -401,14 +404,14 @@ public void testColdStartException() throws InterruptedException { modelState = new ModelState<>(model, modelId, detectorId, ModelType.ENTITY.getName(), clock, priority); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); - listener.onFailure(new AnomalyDetectionException(detectorId, "")); + ActionListener> listener = invocation.getArgument(3); + listener.onFailure(new TimeSeriesException(detectorId, "")); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); - assertTrue(stateManager.getLastDetectionError(detectorId) != null); + assertTrue(stateManager.fetchExceptionAndClear(detectorId).isPresent()); checkSemaphoreRelease(); } @@ -427,24 +430,24 @@ public void testNotEnoughSamples() throws InterruptedException, IOException { GetRequest request = invocation.getArgument(0); ActionListener listener = invocation.getArgument(2); - listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, AnomalyDetector.ANOMALY_DETECTORS_INDEX)); + listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, CommonName.CONFIG_INDEX)); return null; }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); coldStartSamples.add(Optional.of(new double[] { 57.0 })); coldStartSamples.add(Optional.of(new double[] { 1.0 })); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); 
return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -480,15 +483,15 @@ public void testEmptyDataRange() throws InterruptedException { GetRequest request = invocation.getArgument(0); ActionListener listener = invocation.getArgument(2); - listener.onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); return null; }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(894056973000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -508,16 +511,16 @@ public void testTrainModelFromExistingSamplesEnoughSamples() { .dimensions(dimensions) .precision(Precision.FLOAT_32) .randomSeed(rcfSeed) - .numberOfTrees(AnomalyDetectorSettings.NUM_TREES) + .numberOfTrees(TimeSeriesSettings.NUM_TREES) .shingleSize(detector.getShingleSize()) - .boundingBoxCacheFraction(AnomalyDetectorSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) - .timeDecay(AnomalyDetectorSettings.TIME_DECAY) + .boundingBoxCacheFraction(TimeSeriesSettings.REAL_TIME_BOUNDING_BOX_CACHE_RATIO) + .timeDecay(TimeSeriesSettings.TIME_DECAY) .outputAfter(numMinSamples) .initialAcceptFraction(0.125d) .parallelExecutionEnabled(false) - .sampleSize(AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE) + .sampleSize(TimeSeriesSettings.NUM_SAMPLES_PER_TREE) .internalShinglingEnabled(true) - .anomalyRate(1 - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE) + .anomalyRate(1 - TimeSeriesSettings.THRESHOLD_MIN_PVALUE) .transformMethod(TransformMethod.NORMALIZE) .alertOnce(true) .autoAdjust(true); @@ -552,7 +555,7 @@ public void testTrainModelFromExistingSamplesNotEnoughSamples() { @SuppressWarnings("unchecked") private void accuracyTemplate(int detectorIntervalMins, float precisionThreshold, float recallThreshold) throws Exception { int baseDimension = 2; - int dataSize = 20 * AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE; + int dataSize = 20 * TimeSeriesSettings.NUM_SAMPLES_PER_TREE; int trainTestSplit = 300; // detector interval int interval = detectorIntervalMins; @@ -567,7 +570,7 @@ private void accuracyTemplate(int detectorIntervalMins, float precisionThreshold .newInstance() .setDetectionInterval(new IntervalTimeConfiguration(interval, ChronoUnit.MINUTES)) .setCategoryFields(ImmutableList.of(randomAlphaOfLength(5))) - .setShingleSize(AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE) + .setShingleSize(TimeSeriesSettings.DEFAULT_SHINGLE_SIZE) .build(); long seed = new Random().nextLong(); @@ -595,16 +598,15 @@ private void accuracyTemplate(int detectorIntervalMins, float precisionThreshold GetRequest request = invocation.getArgument(0); ActionListener listener = invocation.getArgument(2); - listener - .onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), 
AnomalyDetector.ANOMALY_DETECTORS_INDEX)); + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); return null; }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(timestamps[0])); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); doAnswer(invocation -> { List> ranges = invocation.getArgument(1); @@ -623,13 +625,13 @@ public int compare(Entry p1, Entry p2) { coldStartSamples.add(Optional.of(data[valueIndex])); } - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); EntityModel model = new EntityModel(entity, new ArrayDeque<>(), null); - modelState = new ModelState<>(model, modelId, detector.getDetectorId(), ModelType.ENTITY.getName(), clock, priority); + modelState = new ModelState<>(model, modelId, detector.getId(), ModelType.ENTITY.getName(), clock, priority); released = new AtomicBoolean(); @@ -639,7 +641,7 @@ public int compare(Entry p1, Entry p2) { inProgressLatch.countDown(); }); - entityColdStarter.trainModel(entity, detector.getDetectorId(), modelState, listener); + entityColdStarter.trainModel(entity, detector.getId(), modelState, listener); checkSemaphoreRelease(); assertTrue(model.getTrcf().isPresent()); @@ -697,40 +699,40 @@ public void testAccuracyThirteenMinuteInterval() throws Exception { } public void testAccuracyOneMinuteIntervalNoInterpolation() throws Exception { - EnabledSetting.getInstance().setSettingValue(EnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, false); + ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, false); // for one minute interval, we need to disable interpolation to achieve good results entityColdStarter = new EntityColdStarter( clock, threadPool, stateManager, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.TIME_DECAY, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.TIME_DECAY, numMinSamples, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, - interpolator, + imputer, searchFeatureDao, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, featureManager, settings, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, rcfSeed, - AnomalyDetectorSettings.MAX_COLD_START_ROUNDS + TimeSeriesSettings.MAX_COLD_START_ROUNDS ); modelManager = new ModelManager( mock(CheckpointDao.class), mock(Clock.class), - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.TIME_DECAY, - AnomalyDetectorSettings.NUM_MIN_SAMPLES, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.TIME_DECAY, + TimeSeriesSettings.NUM_MIN_SAMPLES, + 
TimeSeriesSettings.THRESHOLD_MIN_PVALUE, AnomalyDetectorSettings.MIN_PREVIEW_SIZE, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + TimeSeriesSettings.HOURLY_MAINTENANCE, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, mock(FeatureManager.class), mock(MemoryTracker.class), @@ -756,10 +758,10 @@ private ModelState createStateForCacheRelease() { public void testCacheReleaseAfterMaintenance() throws IOException, InterruptedException { ModelState modelState = createStateForCacheRelease(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); @@ -771,10 +773,10 @@ public void testCacheReleaseAfterMaintenance() throws IOException, InterruptedEx coldStartSamples.add(Optional.of(sample2)); coldStartSamples.add(Optional.of(sample3)); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); @@ -788,7 +790,7 @@ public void testCacheReleaseAfterMaintenance() throws IOException, InterruptedEx // make sure when the next maintenance coming, current door keeper gets reset // note our detector interval is 1 minute and the door keeper will expire in 60 intervals, which are 60 minutes - when(clock.instant()).thenReturn(Instant.now().plus(AnomalyDetectorSettings.DOOR_KEEPER_MAINTENANCE_FREQ + 1, ChronoUnit.MINUTES)); + when(clock.instant()).thenReturn(Instant.now().plus(TimeSeriesSettings.DOOR_KEEPER_MAINTENANCE_FREQ + 1, ChronoUnit.MINUTES)); entityColdStarter.maintenance(); modelState = createStateForCacheRelease(); @@ -801,10 +803,10 @@ public void testCacheReleaseAfterMaintenance() throws IOException, InterruptedEx public void testCacheReleaseAfterClear() throws IOException, InterruptedException { ModelState modelState = createStateForCacheRelease(); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(1602269260000L)); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); List> coldStartSamples = new ArrayList<>(); @@ -816,10 +818,10 @@ public void testCacheReleaseAfterClear() throws IOException, InterruptedExceptio coldStartSamples.add(Optional.of(sample2)); coldStartSamples.add(Optional.of(sample3)); doAnswer(invocation -> { - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); 
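
The recurring change in the hunks above is mechanical but easy to get wrong: SearchFeatureDao's cold-start methods gained an AnalysisType parameter, so every Mockito stub has to shift the captured ActionListener index by one (getArgument(2) to getArgument(3), getArgument(4) to getArgument(5)) and pin the new parameter with eq(AnalysisType.AD). A minimal sketch of the pattern — Listener, Analysis, and Dao below are hypothetical stand-ins, not the real, larger SearchFeatureDao API:

    import static org.mockito.ArgumentMatchers.any;
    import static org.mockito.ArgumentMatchers.eq;
    import static org.mockito.Mockito.doAnswer;
    import static org.mockito.Mockito.mock;

    import java.util.Optional;

    // Stand-ins for the real types; the real method takes more parameters.
    interface Listener<T> { void onResponse(T response); }
    enum Analysis { AD, FORECAST }
    interface Dao {
        // Before the refactor the listener sat at index 2; inserting the
        // Analysis argument before it pushes the listener to index 3.
        void getMinDataTime(Object config, Object entity, Analysis type, Listener<Optional<Long>> listener);
    }

    public class ArgumentShiftSketch {
        public static void main(String[] args) {
            Dao dao = mock(Dao.class);
            doAnswer(invocation -> {
                // The listener is now the 4th argument (index 3), not the 3rd.
                Listener<Optional<Long>> listener = invocation.getArgument(3);
                listener.onResponse(Optional.of(1602269260000L));
                return null;
            }).when(dao).getMinDataTime(any(), any(), eq(Analysis.AD), any());
        }
    }

Pinning the inserted parameter with eq(...) instead of another any() keeps the stub from silently matching a differently typed analysis, which is presumably why the tests spell out eq(AnalysisType.AD) everywhere rather than widening the match.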
entityColdStarter.trainModel(entity, detectorId, modelState, listener); checkSemaphoreRelease(); diff --git a/src/test/java/org/opensearch/ad/ml/HCADModelPerfTests.java b/src/test/java/org/opensearch/ad/ml/HCADModelPerfTests.java index a5a7dc287..bf2732777 100644 --- a/src/test/java/org/opensearch/ad/ml/HCADModelPerfTests.java +++ b/src/test/java/org/opensearch/ad/ml/HCADModelPerfTests.java @@ -13,6 +13,7 @@ import static org.mockito.ArgumentMatchers.any; import static org.mockito.ArgumentMatchers.anyBoolean; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.reset; @@ -33,20 +34,22 @@ import org.apache.lucene.tests.util.TimeUnits; import org.opensearch.action.get.GetRequest; import org.opensearch.action.get.GetResponse; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.feature.FeatureManager; -import org.opensearch.ad.feature.SearchFeatureDao; import org.opensearch.ad.ml.ModelManager.ModelType; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.test.ClusterServiceUtils; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.carrotsearch.randomizedtesting.annotations.TimeoutSuite; import com.google.common.collect.ImmutableList; @@ -75,7 +78,7 @@ private void averageAccuracyTemplate( int baseDimension, boolean anomalyIndependent ) throws Exception { - int dataSize = 20 * AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE; + int dataSize = 20 * TimeSeriesSettings.NUM_SAMPLES_PER_TREE; int trainTestSplit = 300; // detector interval int interval = detectorIntervalMins; @@ -93,13 +96,13 @@ private void averageAccuracyTemplate( .newInstance() .setDetectionInterval(new IntervalTimeConfiguration(interval, ChronoUnit.MINUTES)) .setCategoryFields(ImmutableList.of(randomAlphaOfLength(5))) - .setShingleSize(AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE) + .setShingleSize(TimeSeriesSettings.DEFAULT_SHINGLE_SIZE) .build(); doAnswer(invocation -> { ActionListener listener = invocation.getArgument(2); - listener.onResponse(TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX)); + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); return null; }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class)); @@ -113,7 +116,7 @@ private void averageAccuracyTemplate( featureManager = new FeatureManager( searchFeatureDao, - interpolator, + imputer, clock, AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, @@ -123,43 +126,43 @@ private void averageAccuracyTemplate( 
AnomalyDetectorSettings.MAX_IMPUTATION_NEIGHBOR_DISTANCE, AnomalyDetectorSettings.PREVIEW_SAMPLE_RATE, AnomalyDetectorSettings.MAX_PREVIEW_SAMPLES, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, threadPool, - AnomalyDetectorPlugin.AD_THREAD_POOL_NAME + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME ); entityColdStarter = new EntityColdStarter( clock, threadPool, stateManager, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.TIME_DECAY, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.TIME_DECAY, numMinSamples, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, - interpolator, + imputer, searchFeatureDao, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, featureManager, settings, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, seed, - AnomalyDetectorSettings.MAX_COLD_START_ROUNDS + TimeSeriesSettings.MAX_COLD_START_ROUNDS ); modelManager = new ModelManager( mock(CheckpointDao.class), mock(Clock.class), - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.TIME_DECAY, - AnomalyDetectorSettings.NUM_MIN_SAMPLES, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.TIME_DECAY, + TimeSeriesSettings.NUM_MIN_SAMPLES, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, AnomalyDetectorSettings.MIN_PREVIEW_SIZE, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + TimeSeriesSettings.HOURLY_MAINTENANCE, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, mock(FeatureManager.class), mock(MemoryTracker.class), @@ -187,10 +190,10 @@ private void averageAccuracyTemplate( when(clock.millis()).thenReturn(timestamps[trainTestSplit - 1]); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(2); + ActionListener> listener = invocation.getArgument(3); listener.onResponse(Optional.of(timestamps[0])); return null; - }).when(searchFeatureDao).getEntityMinDataTime(any(), any(), any()); + }).when(searchFeatureDao).getMinDataTime(any(), any(), eq(AnalysisType.AD), any()); doAnswer(invocation -> { List> ranges = invocation.getArgument(1); @@ -209,17 +212,17 @@ public int compare(Entry p1, Entry p2) { coldStartSamples.add(Optional.of(data[valueIndex])); } - ActionListener>> listener = invocation.getArgument(4); + ActionListener>> listener = invocation.getArgument(5); listener.onResponse(coldStartSamples); return null; - }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), any()); + }).when(searchFeatureDao).getColdStartSamplesForPeriods(any(), any(), any(), anyBoolean(), eq(AnalysisType.AD), any()); entity = Entity.createSingleAttributeEntity("field", entityName + z); EntityModel model = new EntityModel(entity, new ArrayDeque<>(), null); ModelState modelState = new ModelState<>( model, entity.getModelId(detectorId).get(), - detector.getDetectorId(), + detector.getId(), ModelType.ENTITY.getName(), clock, priority @@ -233,7 +236,7 @@ public int compare(Entry p1, Entry p2) { inProgressLatch.countDown(); }); - entityColdStarter.trainModel(entity, detector.getDetectorId(), modelState, listener); + entityColdStarter.trainModel(entity, detector.getId(), modelState, listener); 
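
The expected-data arithmetic in EntityColdStarterTests above (61-point segments feeding convertToFeatures(interval, 60) or 61, summing to 121, 241, or 301) assumes the imputer returns n evenly spaced points per call including both endpoints, so adjacent segments share a boundary sample and all but the last segment contribute n - 1 points. A self-contained sketch of that endpoint-inclusive behavior for the two-sample case these tests exercise — written independently of the real LinearUniformImputer, which also handles longer sample runs:

    // Sketch only: endpoint-inclusive linear interpolation between the first
    // and last value of each feature row, as the test arithmetic assumes.
    public class LinearImputeSketch {

        // features[i] holds the two anchor samples for feature i; n is the
        // number of returned points per feature, including both endpoints.
        static double[][] impute(double[][] features, int n) {
            double[][] out = new double[features.length][n];
            for (int i = 0; i < features.length; i++) {
                double first = features[i][0];
                double last = features[i][features[i].length - 1];
                for (int j = 0; j < n; j++) {
                    // Evenly spaced points from first to last, inclusive.
                    out[i][j] = first + (last - first) * j / (n - 1);
                }
            }
            return out;
        }

        public static void main(String[] args) {
            double[][] segment = impute(new double[][] { { 57.0, 1.0 } }, 61);
            System.out.println(segment[0].length); // 61 points per segment
            System.out.println(segment[0][0]);     // 57.0 (first endpoint)
            System.out.println(segment[0][60]);    // 1.0 (last endpoint)
            // Two adjacent segments share their boundary point, so they
            // contribute 60 + 61 = 121 distinct samples, matching
            // assertEquals(121, expectedColdStartData.size()) above.
        }
    }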
checkSemaphoreRelease(); assertTrue(model.getTrcf().isPresent()); diff --git a/src/test/java/org/opensearch/ad/ml/ModelManagerTests.java b/src/test/java/org/opensearch/ad/ml/ModelManagerTests.java index 8cf80a89d..cbb7b09ba 100644 --- a/src/test/java/org/opensearch/ad/ml/ModelManagerTests.java +++ b/src/test/java/org/opensearch/ad/ml/ModelManagerTests.java @@ -51,24 +51,12 @@ import org.mockito.ArgumentCaptor; import org.mockito.Mock; import org.mockito.MockitoAnnotations; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.EntityCache; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.dataprocessor.IntegerSensitiveSingleFeatureLinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.LinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator; import org.opensearch.ad.feature.FeatureManager; -import org.opensearch.ad.feature.SearchFeatureDao; import org.opensearch.ad.ml.ModelManager.ModelType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.ratelimit.CheckpointWriteWorker; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.Settings; @@ -76,6 +64,18 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.monitor.jvm.JvmService; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.feature.SearchFeatureDao; +import org.opensearch.timeseries.ml.SingleStreamModelIdMapper; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import com.amazon.randomcutforest.RandomCutForest; import com.amazon.randomcutforest.parkservices.AnomalyDescriptor; @@ -163,7 +163,7 @@ public class ModelManagerTests { private Instant now; @Mock - private ADCircuitBreakerService adCircuitBreakerService; + private CircuitBreakerService adCircuitBreakerService; private String modelId = "modelId"; @@ -205,7 +205,7 @@ public void setup() { when(rcf.process(any(), anyLong())).thenReturn(descriptor); ExecutorService executorService = mock(ExecutorService.class); - when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); doAnswer(invocation -> { Runnable runnable = invocation.getArgument(0); runnable.run(); @@ -235,7 +235,7 @@ public void setup() { thresholdMinPvalue, minPreviewSize, modelTtl, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + 
AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, featureManager, memoryTracker, @@ -409,13 +409,7 @@ public void getRcfResult_throwToListener_whenHeapLimitExceed() { when(jvmService.info().getMem().getHeapMax().getBytes()).thenReturn(1_000L); - MemoryTracker memoryTracker = new MemoryTracker( - jvmService, - modelMaxSizePercentage, - modelDesiredSizePercentage, - null, - adCircuitBreakerService - ); + MemoryTracker memoryTracker = new MemoryTracker(jvmService, modelMaxSizePercentage, null, adCircuitBreakerService); ActionListener listener = mock(ActionListener.class); @@ -431,7 +425,7 @@ public void getRcfResult_throwToListener_whenHeapLimitExceed() { thresholdMinPvalue, minPreviewSize, modelTtl, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, featureManager, memoryTracker, @@ -694,14 +688,14 @@ public void trainModel_throwLimitExceededToListener_whenLimitExceed() { @Test public void getRcfModelId_returnNonEmptyString() { - String rcfModelId = SingleStreamModelIdMapper.getRcfModelId(anomalyDetector.getDetectorId(), 0); + String rcfModelId = SingleStreamModelIdMapper.getRcfModelId(anomalyDetector.getId(), 0); assertFalse(rcfModelId.isEmpty()); } @Test public void getThresholdModelId_returnNonEmptyString() { - String thresholdModelId = SingleStreamModelIdMapper.getThresholdModelId(anomalyDetector.getDetectorId()); + String thresholdModelId = SingleStreamModelIdMapper.getThresholdModelId(anomalyDetector.getId()); assertFalse(thresholdModelId.isEmpty()); } @@ -908,9 +902,7 @@ public void getNullState() { public void getEmptyStateFullSamples() { SearchFeatureDao searchFeatureDao = mock(SearchFeatureDao.class); - SingleFeatureLinearUniformInterpolator singleFeatureLinearUniformInterpolator = - new IntegerSensitiveSingleFeatureLinearUniformInterpolator(); - LinearUniformInterpolator interpolator = new LinearUniformInterpolator(singleFeatureLinearUniformInterpolator); + LinearUniformImputer interpolator = new LinearUniformImputer(true); NodeStateManager stateManager = mock(NodeStateManager.class); featureManager = new FeatureManager( @@ -925,9 +917,9 @@ public void getEmptyStateFullSamples() { AnomalyDetectorSettings.MAX_IMPUTATION_NEIGHBOR_DISTANCE, AnomalyDetectorSettings.PREVIEW_SAMPLE_RATE, AnomalyDetectorSettings.MAX_PREVIEW_SAMPLES, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, threadPool, - AnomalyDetectorPlugin.AD_THREAD_POOL_NAME + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME ); CheckpointWriteWorker checkpointWriteQueue = mock(CheckpointWriteWorker.class); @@ -936,20 +928,20 @@ public void getEmptyStateFullSamples() { clock, threadPool, stateManager, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, - AnomalyDetectorSettings.NUM_TREES, - AnomalyDetectorSettings.TIME_DECAY, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_TREES, + TimeSeriesSettings.TIME_DECAY, numMinSamples, AnomalyDetectorSettings.MAX_SAMPLE_STRIDE, AnomalyDetectorSettings.MAX_TRAIN_SAMPLE, interpolator, searchFeatureDao, - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE, + TimeSeriesSettings.THRESHOLD_MIN_PVALUE, featureManager, settings, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, - AnomalyDetectorSettings.MAX_COLD_START_ROUNDS + TimeSeriesSettings.MAX_COLD_START_ROUNDS ); modelManager = spy( @@ -963,7 +955,7 @@ public void getEmptyStateFullSamples() { thresholdMinPvalue, minPreviewSize, modelTtl, - 
AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ, entityColdStarter, featureManager, memoryTracker, diff --git a/src/test/java/org/opensearch/ad/ml/SingleStreamModelIdMapperTests.java b/src/test/java/org/opensearch/ad/ml/SingleStreamModelIdMapperTests.java index 59a0d02da..bda02043a 100644 --- a/src/test/java/org/opensearch/ad/ml/SingleStreamModelIdMapperTests.java +++ b/src/test/java/org/opensearch/ad/ml/SingleStreamModelIdMapperTests.java @@ -12,6 +12,7 @@ package org.opensearch.ad.ml; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.ml.SingleStreamModelIdMapper; public class SingleStreamModelIdMapperTests extends OpenSearchTestCase { public void testGetThresholdModelIdFromRCFModelId() { diff --git a/src/test/java/org/opensearch/ad/ml/ThresholdingResultTests.java b/src/test/java/org/opensearch/ad/ml/ThresholdingResultTests.java index 1c73dceff..cd63f60d1 100644 --- a/src/test/java/org/opensearch/ad/ml/ThresholdingResultTests.java +++ b/src/test/java/org/opensearch/ad/ml/ThresholdingResultTests.java @@ -15,6 +15,10 @@ import org.junit.Test; import org.junit.runner.RunWith; +import org.opensearch.ad.model.AnomalyResult; + +import junitparams.JUnitParamsRunner; +import junitparams.Parameters; import junitparams.JUnitParamsRunner; import junitparams.Parameters; @@ -36,6 +40,9 @@ public void getters_returnExcepted() { private Object[] equalsData() { return new Object[] { + new Object[] { thresholdingResult, thresholdingResult, true }, + new Object[] { thresholdingResult, null, false }, + new Object[] { thresholdingResult, AnomalyResult.getDummyResult(), false }, new Object[] { thresholdingResult, null, false }, new Object[] { thresholdingResult, thresholdingResult, true }, new Object[] { thresholdingResult, 1, false }, diff --git a/src/test/java/org/opensearch/ad/mock/plugin/MockReindexPlugin.java b/src/test/java/org/opensearch/ad/mock/plugin/MockReindexPlugin.java index fed9b47ac..f8c54b112 100644 --- a/src/test/java/org/opensearch/ad/mock/plugin/MockReindexPlugin.java +++ b/src/test/java/org/opensearch/ad/mock/plugin/MockReindexPlugin.java @@ -25,8 +25,7 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.mock.transport.MockAnomalyDetectorJobAction; import org.opensearch.ad.mock.transport.MockAnomalyDetectorJobTransportActionWithUser; import org.opensearch.client.Client; @@ -45,6 +44,7 @@ import org.opensearch.plugins.Plugin; import org.opensearch.search.SearchHit; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.TestHelpers; import org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableList; @@ -114,7 +114,7 @@ protected void doExecute(Task task, DeleteByQueryRequest request, ActionListener BulkRequestBuilder bulkRequestBuilder = client.prepareBulk(); while (iterator.hasNext()) { String id = iterator.next().getId(); - DeleteRequest deleteRequest = new DeleteRequest(CommonName.DETECTION_STATE_INDEX, id); + DeleteRequest deleteRequest = new DeleteRequest(ADCommonName.DETECTION_STATE_INDEX, id); bulkRequestBuilder.add(deleteRequest); } BulkRequest bulkRequest = bulkRequestBuilder.request().setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); diff --git 
a/src/test/java/org/opensearch/ad/mock/transport/MockADCancelTaskNodeRequest_1_0.java b/src/test/java/org/opensearch/ad/mock/transport/MockADCancelTaskNodeRequest_1_0.java index 01cd56276..266ec5bd0 100644 --- a/src/test/java/org/opensearch/ad/mock/transport/MockADCancelTaskNodeRequest_1_0.java +++ b/src/test/java/org/opensearch/ad/mock/transport/MockADCancelTaskNodeRequest_1_0.java @@ -39,7 +39,7 @@ public void writeTo(StreamOutput out) throws IOException { out.writeOptionalString(userName); } - public String getDetectorId() { + public String getId() { return detectorId; } diff --git a/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobAction.java b/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobAction.java index 327e3bf51..a861ec9de 100644 --- a/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobAction.java +++ b/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobAction.java @@ -13,15 +13,15 @@ import org.opensearch.action.ActionType; import org.opensearch.ad.constant.CommonValue; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; +import org.opensearch.timeseries.transport.JobResponse; -public class MockAnomalyDetectorJobAction extends ActionType { +public class MockAnomalyDetectorJobAction extends ActionType { // External Action which used for public facing RestAPIs. public static final String NAME = CommonValue.EXTERNAL_ACTION_PREFIX + "detector/mockjobmanagement"; public static final MockAnomalyDetectorJobAction INSTANCE = new MockAnomalyDetectorJobAction(); private MockAnomalyDetectorJobAction() { - super(NAME, AnomalyDetectorJobResponse::new); + super(NAME, JobResponse::new); } } diff --git a/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobTransportActionWithUser.java b/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobTransportActionWithUser.java index ad2cce014..48425a747 100644 --- a/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobTransportActionWithUser.java +++ b/src/test/java/org/opensearch/ad/mock/transport/MockAnomalyDetectorJobTransportActionWithUser.java @@ -11,23 +11,21 @@ package org.opensearch.ad.mock.transport; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT; -import static org.opensearch.ad.util.ParseUtils.resolveUserAndExecute; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT; +import static org.opensearch.timeseries.util.ParseUtils.resolveUserAndExecute; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; import org.opensearch.ad.ExecuteADResultResponseRecorder; -import org.opensearch.ad.indices.AnomalyDetectionIndices; -import org.opensearch.ad.model.DetectionDateRange; +import org.opensearch.ad.indices.ADIndexManagement; +import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.rest.handler.IndexAnomalyDetectorJobActionHandler; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.ad.transport.AnomalyDetectorJobRequest; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.ad.transport.AnomalyDetectorJobTransportAction; -import 
org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.inject.Inject; @@ -38,16 +36,18 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.transport.JobResponse; +import org.opensearch.timeseries.util.RestHandlerUtils; import org.opensearch.transport.TransportService; -public class MockAnomalyDetectorJobTransportActionWithUser extends - HandledTransportAction { +public class MockAnomalyDetectorJobTransportActionWithUser extends HandledTransportAction { private final Logger logger = LogManager.getLogger(AnomalyDetectorJobTransportAction.class); private final Client client; private final ClusterService clusterService; private final Settings settings; - private final AnomalyDetectionIndices anomalyDetectionIndices; + private final ADIndexManagement anomalyDetectionIndices; private final NamedXContentRegistry xContentRegistry; private volatile Boolean filterByEnabled; private ThreadContext.StoredContext context; @@ -62,7 +62,7 @@ public MockAnomalyDetectorJobTransportActionWithUser( Client client, ClusterService clusterService, Settings settings, - AnomalyDetectionIndices anomalyDetectionIndices, + ADIndexManagement anomalyDetectionIndices, NamedXContentRegistry xContentRegistry, ADTaskManager adTaskManager, ExecuteADResultResponseRecorder recorder @@ -75,8 +75,8 @@ public MockAnomalyDetectorJobTransportActionWithUser( this.anomalyDetectionIndices = anomalyDetectionIndices; this.xContentRegistry = xContentRegistry; this.adTaskManager = adTaskManager; - filterByEnabled = FILTER_BY_BACKEND_ROLES.get(settings); - clusterService.getClusterSettings().addSettingsUpdateConsumer(FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); + filterByEnabled = AD_FILTER_BY_BACKEND_ROLES.get(settings); + clusterService.getClusterSettings().addSettingsUpdateConsumer(AD_FILTER_BY_BACKEND_ROLES, it -> filterByEnabled = it); ThreadContext threadContext = new ThreadContext(settings); context = threadContext.stashContext(); @@ -84,14 +84,14 @@ public MockAnomalyDetectorJobTransportActionWithUser( } @Override - protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionListener listener) { + protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionListener listener) { String detectorId = request.getDetectorID(); - DetectionDateRange detectionDateRange = request.getDetectionDateRange(); + DateRange detectionDateRange = request.getDetectionDateRange(); boolean historical = request.isHistorical(); long seqNo = request.getSeqNo(); long primaryTerm = request.getPrimaryTerm(); String rawPath = request.getRawPath(); - TimeValue requestTimeout = REQUEST_TIMEOUT.get(settings); + TimeValue requestTimeout = AD_REQUEST_TIMEOUT.get(settings); String userStr = "user_name|backendrole1,backendrole2|roles1,role2"; // By the time request reaches here, the user permissions are validated by Security plugin. 
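
The hard-coded user string above follows the pipe-delimited convention name|backend_roles|roles, with commas separating the values inside a field; the actual parsing is done by org.opensearch.commons.authuser.User.parse. A small illustrative sketch of that assumed layout, not the real parser:

    import java.util.Arrays;
    import java.util.List;

    // Illustrative only: mirrors the layout "name|backendrole1,backendrole2|role1,role2"
    // assumed by the test's user string; real parsing lives in commons.authuser.User.
    public class UserStringSketch {
        public static void main(String[] args) {
            String userStr = "user_name|backendrole1,backendrole2|roles1,role2";
            String[] parts = userStr.split("\\|");
            String name = parts[0];
            List<String> backendRoles = Arrays.asList(parts[1].split(","));
            List<String> roles = Arrays.asList(parts[2].split(","));
            // Prints: user_name [backendrole1, backendrole2] [roles1, role2]
            System.out.println(name + " " + backendRoles + " " + roles);
        }
    }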
User user = User.parse(userStr); @@ -114,7 +114,8 @@ protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionLis ), client, clusterService, - xContentRegistry + xContentRegistry, + AnomalyDetector.class ); } catch (Exception e) { logger.error(e); @@ -123,14 +124,14 @@ protected void doExecute(Task task, AnomalyDetectorJobRequest request, ActionLis } private void executeDetector( - ActionListener listener, + ActionListener listener, String detectorId, long seqNo, long primaryTerm, String rawPath, TimeValue requestTimeout, User user, - DetectionDateRange detectionDateRange, + DateRange detectionDateRange, boolean historical ) { IndexAnomalyDetectorJobActionHandler handler = new IndexAnomalyDetectorJobActionHandler( diff --git a/src/test/java/org/opensearch/ad/mock/transport/MockForwardADTaskRequest_1_0.java b/src/test/java/org/opensearch/ad/mock/transport/MockForwardADTaskRequest_1_0.java index c38d05c9d..8b4f5e0d3 100644 --- a/src/test/java/org/opensearch/ad/mock/transport/MockForwardADTaskRequest_1_0.java +++ b/src/test/java/org/opensearch/ad/mock/transport/MockForwardADTaskRequest_1_0.java @@ -17,7 +17,7 @@ import org.opensearch.action.ActionRequest; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.commons.authuser.User; import org.opensearch.core.common.io.stream.StreamInput; @@ -60,12 +60,12 @@ public void writeTo(StreamOutput out) throws IOException { public ActionRequestValidationException validate() { ActionRequestValidationException validationException = null; if (detector == null) { - validationException = addValidationError(CommonErrorMessages.DETECTOR_MISSING, validationException); - } else if (detector.getDetectorId() == null) { - validationException = addValidationError(CommonErrorMessages.AD_ID_MISSING_MSG, validationException); + validationException = addValidationError(ADCommonMessages.DETECTOR_MISSING, validationException); + } else if (detector.getId() == null) { + validationException = addValidationError(ADCommonMessages.AD_ID_MISSING_MSG, validationException); } if (adTaskAction == null) { - validationException = addValidationError(CommonErrorMessages.AD_TASK_ACTION_MISSING, validationException); + validationException = addValidationError(ADCommonMessages.AD_TASK_ACTION_MISSING, validationException); } return validationException; } diff --git a/src/test/java/org/opensearch/ad/model/ADEntityTaskProfileTests.java b/src/test/java/org/opensearch/ad/model/ADEntityTaskProfileTests.java index e13351d5b..27456589a 100644 --- a/src/test/java/org/opensearch/ad/model/ADEntityTaskProfileTests.java +++ b/src/test/java/org/opensearch/ad/model/ADEntityTaskProfileTests.java @@ -9,8 +9,6 @@ import java.util.Collection; import java.util.TreeMap; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; @@ -18,12 +16,15 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.model.Entity; public class 
ADEntityTaskProfileTests extends OpenSearchSingleNodeTestCase { @Override protected Collection> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override diff --git a/src/test/java/org/opensearch/ad/model/ADTaskTests.java b/src/test/java/org/opensearch/ad/model/ADTaskTests.java index 179c6cafa..d97dc15dd 100644 --- a/src/test/java/org/opensearch/ad/model/ADTaskTests.java +++ b/src/test/java/org/opensearch/ad/model/ADTaskTests.java @@ -16,8 +16,6 @@ import java.time.temporal.ChronoUnit; import java.util.Collection; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; @@ -25,12 +23,15 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.model.TaskState; public class ADTaskTests extends OpenSearchSingleNodeTestCase { @Override protected Collection> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override @@ -39,7 +40,7 @@ protected NamedWriteableRegistry writableRegistry() { } public void testAdTaskSerialization() throws IOException { - ADTask adTask = TestHelpers.randomAdTask(randomAlphaOfLength(5), ADTaskState.STOPPED, Instant.now(), randomAlphaOfLength(5), true); + ADTask adTask = TestHelpers.randomAdTask(randomAlphaOfLength(5), TaskState.STOPPED, Instant.now(), randomAlphaOfLength(5), true); BytesStreamOutput output = new BytesStreamOutput(); adTask.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); @@ -48,7 +49,7 @@ public void testAdTaskSerialization() throws IOException { } public void testAdTaskSerializationWithNullDetector() throws IOException { - ADTask adTask = TestHelpers.randomAdTask(randomAlphaOfLength(5), ADTaskState.STOPPED, Instant.now(), randomAlphaOfLength(5), false); + ADTask adTask = TestHelpers.randomAdTask(randomAlphaOfLength(5), TaskState.STOPPED, Instant.now(), randomAlphaOfLength(5), false); BytesStreamOutput output = new BytesStreamOutput(); adTask.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); @@ -58,7 +59,7 @@ public void testAdTaskSerializationWithNullDetector() throws IOException { public void testParseADTask() throws IOException { ADTask adTask = TestHelpers - .randomAdTask(null, ADTaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), true); + .randomAdTask(null, TaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), true); String taskId = randomAlphaOfLength(5); adTask.setTaskId(taskId); String adTaskString = TestHelpers.xContentBuilderToString(adTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); @@ -69,7 +70,7 @@ public void testParseADTask() throws IOException { public void testParseADTaskWithoutTaskId() throws IOException { String taskId = null; ADTask adTask = 
TestHelpers - .randomAdTask(taskId, ADTaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), true); + .randomAdTask(taskId, TaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), true); String adTaskString = TestHelpers.xContentBuilderToString(adTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); ADTask parsedADTask = ADTask.parse(TestHelpers.parser(adTaskString)); assertEquals("Parsing AD task doesn't work", adTask, parsedADTask); @@ -78,7 +79,7 @@ public void testParseADTaskWithoutTaskId() throws IOException { public void testParseADTaskWithNullDetector() throws IOException { String taskId = randomAlphaOfLength(5); ADTask adTask = TestHelpers - .randomAdTask(taskId, ADTaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), false); + .randomAdTask(taskId, TaskState.STOPPED, Instant.now().truncatedTo(ChronoUnit.SECONDS), randomAlphaOfLength(5), false); String adTaskString = TestHelpers.xContentBuilderToString(adTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); ADTask parsedADTask = ADTask.parse(TestHelpers.parser(adTaskString), taskId); assertEquals("Parsing AD task doesn't work", adTask, parsedADTask); diff --git a/src/test/java/org/opensearch/ad/model/AnomalyDetectorExecutionInputTests.java b/src/test/java/org/opensearch/ad/model/AnomalyDetectorExecutionInputTests.java index ccd1b32e2..d383aed3d 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyDetectorExecutionInputTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyDetectorExecutionInputTests.java @@ -16,9 +16,9 @@ import java.time.temporal.ChronoUnit; import java.util.Locale; -import org.opensearch.ad.TestHelpers; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; public class AnomalyDetectorExecutionInputTests extends OpenSearchTestCase { diff --git a/src/test/java/org/opensearch/ad/model/AnomalyDetectorJobTests.java b/src/test/java/org/opensearch/ad/model/AnomalyDetectorJobTests.java index 872595153..df506b010 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyDetectorJobTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyDetectorJobTests.java @@ -15,8 +15,6 @@ import java.util.Collection; import java.util.Locale; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; @@ -24,12 +22,15 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.model.Job; public class AnomalyDetectorJobTests extends OpenSearchSingleNodeTestCase { @Override protected Collection> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override @@ -38,22 +39,22 @@ protected NamedWriteableRegistry writableRegistry() { } public void testParseAnomalyDetectorJob() throws IOException { - AnomalyDetectorJob anomalyDetectorJob = TestHelpers.randomAnomalyDetectorJob(); + Job anomalyDetectorJob = 
TestHelpers.randomAnomalyDetectorJob(); String anomalyDetectorJobString = TestHelpers .xContentBuilderToString(anomalyDetectorJob.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); anomalyDetectorJobString = anomalyDetectorJobString .replaceFirst("\\{", String.format(Locale.ROOT, "{\"%s\":\"%s\",", randomAlphaOfLength(5), randomAlphaOfLength(5))); - AnomalyDetectorJob parsedAnomalyDetectorJob = AnomalyDetectorJob.parse(TestHelpers.parser(anomalyDetectorJobString)); + Job parsedAnomalyDetectorJob = Job.parse(TestHelpers.parser(anomalyDetectorJobString)); assertEquals("Parsing anomaly detect result doesn't work", anomalyDetectorJob, parsedAnomalyDetectorJob); } public void testSerialization() throws IOException { - AnomalyDetectorJob anomalyDetectorJob = TestHelpers.randomAnomalyDetectorJob(); + Job anomalyDetectorJob = TestHelpers.randomAnomalyDetectorJob(); BytesStreamOutput output = new BytesStreamOutput(); anomalyDetectorJob.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); - AnomalyDetectorJob parsedAnomalyDetectorJob = new AnomalyDetectorJob(input); + Job parsedAnomalyDetectorJob = new Job(input); assertNotNull(parsedAnomalyDetectorJob); } } diff --git a/src/test/java/org/opensearch/ad/model/AnomalyDetectorSerializationTests.java b/src/test/java/org/opensearch/ad/model/AnomalyDetectorSerializationTests.java index 4d9cf9d95..aa32bf495 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyDetectorSerializationTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyDetectorSerializationTests.java @@ -15,14 +15,14 @@ import java.time.Instant; import java.util.Collection; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -30,7 +30,7 @@ public class AnomalyDetectorSerializationTests extends OpenSearchSingleNodeTestCase { @Override protected Collection<Class<? extends Plugin>> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override diff --git a/src/test/java/org/opensearch/ad/model/AnomalyDetectorTests.java b/src/test/java/org/opensearch/ad/model/AnomalyDetectorTests.java index 533099594..d3298eae2 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyDetectorTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyDetectorTests.java @@ -11,11 +11,10 @@ package org.opensearch.ad.model; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_CHAR_IN_RESULT_INDEX_NAME; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_RESULT_INDEX_NAME_SIZE; -import static org.opensearch.ad.constant.CommonErrorMessages.INVALID_RESULT_INDEX_PREFIX; -import static org.opensearch.ad.constant.CommonName.CUSTOM_RESULT_INDEX_PREFIX; +import static org.opensearch.ad.constant.ADCommonMessages.INVALID_RESULT_INDEX_PREFIX; +import static
org.opensearch.ad.constant.ADCommonName.CUSTOM_RESULT_INDEX_PREFIX; import static org.opensearch.ad.model.AnomalyDetector.MAX_RESULT_INDEX_NAME_SIZE; +import static org.opensearch.timeseries.constant.CommonMessages.INVALID_CHAR_IN_RESULT_INDEX_NAME; import java.io.IOException; import java.time.Instant; @@ -23,20 +22,21 @@ import java.util.Locale; import java.util.concurrent.TimeUnit; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.common.exception.ADValidationException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.settings.AnomalyDetectorSettings; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.common.unit.TimeValue; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.index.query.MatchAllQueryBuilder; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; -public class AnomalyDetectorTests extends AbstractADTest { +public class AnomalyDetectorTests extends AbstractTimeSeriesTest { public void testParseAnomalyDetector() throws IOException { AnomalyDetector detector = TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), Instant.now()); @@ -49,7 +49,7 @@ public void testParseAnomalyDetector() throws IOException { } public void testParseAnomalyDetectorWithCustomIndex() throws IOException { - String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test"; + String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test"; AnomalyDetector detector = TestHelpers .randomDetector( ImmutableList.of(TestHelpers.randomFeature()), @@ -64,15 +64,15 @@ public void testParseAnomalyDetectorWithCustomIndex() throws IOException { detectorString = detectorString .replaceFirst("\\{", String.format(Locale.ROOT, "{\"%s\":\"%s\",", randomAlphaOfLength(5), randomAlphaOfLength(5))); AnomalyDetector parsedDetector = AnomalyDetector.parse(TestHelpers.parser(detectorString)); - assertEquals("Parsing result index doesn't work", resultIndex, parsedDetector.getResultIndex()); + assertEquals("Parsing result index doesn't work", resultIndex, parsedDetector.getCustomResultIndex()); assertEquals("Parsing anomaly detector doesn't work", detector, parsedDetector); } public void testAnomalyDetectorWithInvalidCustomIndex() throws Exception { - String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test@@"; + String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test@@"; TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> (TestHelpers .randomDetector( ImmutableList.of(TestHelpers.randomFeature()), @@ -104,13 +104,7 @@ public void testParseAnomalyDetectorWithCustomDetectionDelay() throws IOExceptio detectorString = detectorString .replaceFirst("\\{", String.format(Locale.ROOT, "{\"%s\":\"%s\",", randomAlphaOfLength(5), randomAlphaOfLength(5))); AnomalyDetector parsedDetector = AnomalyDetector - .parse( - TestHelpers.parser(detectorString), - detector.getDetectorId(), - detector.getVersion(), - detectionInterval, - detectionWindowDelay - ); + 
.parse(TestHelpers.parser(detectorString), detector.getId(), detector.getVersion(), detectionInterval, detectionWindowDelay); assertEquals("Parsing anomaly detector doesn't work", detector, parsedDetector); } @@ -184,7 +178,7 @@ public void testParseAnomalyDetectorWithWrongFilterQuery() throws Exception { + "-1203962153,\"ui_metadata\":{\"JbAaV\":{\"feature_id\":\"rIFjS\",\"feature_name\":\"QXCmS\"," + "\"feature_enabled\":false,\"aggregation_query\":{\"aa\":{\"value_count\":{\"field\":\"ok\"}}}}}," + "\"last_update_time\":1568396089028}"; - TestHelpers.assertFailWith(ADValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); + TestHelpers.assertFailWith(ValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); } public void testParseAnomalyDetectorWithoutOptionalParams() throws IOException { @@ -197,7 +191,7 @@ public void testParseAnomalyDetectorWithoutOptionalParams() throws IOException { + "\"aggregation_query\":{\"aa\":{\"value_count\":{\"field\":\"ok\"}}}}},\"last_update_time\":1568396089028}"; AnomalyDetector parsedDetector = AnomalyDetector.parse(TestHelpers.parser(detectorString), "id", 1L, null, null); assertTrue(parsedDetector.getFilterQuery() instanceof MatchAllQueryBuilder); - assertEquals((long) parsedDetector.getShingleSize(), (long) AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE); + assertEquals((long) parsedDetector.getShingleSize(), (long) TimeSeriesSettings.DEFAULT_SHINGLE_SIZE); } public void testParseAnomalyDetectorWithInvalidShingleSize() throws Exception { @@ -208,7 +202,7 @@ public void testParseAnomalyDetectorWithInvalidShingleSize() throws Exception { + "{\"period\":{\"interval\":425,\"unit\":\"Minutes\"}},\"shingle_size\":-1,\"schema_version\":-1203962153,\"ui_metadata\":" + "{\"JbAaV\":{\"feature_id\":\"rIFjS\",\"feature_name\":\"QXCmS\",\"feature_enabled\":false," + "\"aggregation_query\":{\"aa\":{\"value_count\":{\"field\":\"ok\"}}}}},\"last_update_time\":1568396089028}"; - TestHelpers.assertFailWith(ADValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); + TestHelpers.assertFailWith(ValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); } public void testParseAnomalyDetectorWithNegativeWindowDelay() throws Exception { @@ -220,7 +214,7 @@ public void testParseAnomalyDetectorWithNegativeWindowDelay() throws Exception { + "\"unit\":\"Minutes\"}},\"shingle_size\":4,\"schema_version\":-1203962153,\"ui_metadata\":{\"JbAaV\":{\"feature_id\":" + "\"rIFjS\",\"feature_name\":\"QXCmS\",\"feature_enabled\":false,\"aggregation_query\":{\"aa\":" + "{\"value_count\":{\"field\":\"ok\"}}}}},\"last_update_time\":1568396089028}"; - TestHelpers.assertFailWith(ADValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); + TestHelpers.assertFailWith(ValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); } public void testParseAnomalyDetectorWithNegativeDetectionInterval() throws Exception { @@ -232,7 +226,7 @@ public void testParseAnomalyDetectorWithNegativeDetectionInterval() throws Excep + "\"unit\":\"Minutes\"}},\"shingle_size\":4,\"schema_version\":-1203962153,\"ui_metadata\":{\"JbAaV\":{\"feature_id\":" + "\"rIFjS\",\"feature_name\":\"QXCmS\",\"feature_enabled\":false,\"aggregation_query\":{\"aa\":" + "{\"value_count\":{\"field\":\"ok\"}}}}},\"last_update_time\":1568396089028}"; - TestHelpers.assertFailWith(ADValidationException.class, () -> 
AnomalyDetector.parse(TestHelpers.parser(detectorString))); + TestHelpers.assertFailWith(ValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); } public void testParseAnomalyDetectorWithIncorrectFeatureQuery() throws Exception { @@ -244,7 +238,7 @@ public void testParseAnomalyDetectorWithIncorrectFeatureQuery() throws Exception + "\"unit\":\"Minutes\"}},\"shingle_size\":4,\"schema_version\":-1203962153,\"ui_metadata\":{\"JbAaV\":{\"feature_id\":" + "\"rIFjS\",\"feature_name\":\"QXCmS\",\"feature_enabled\":false,\"aggregation_query\":{\"aa\":" + "{\"value_count\":{\"field\":\"ok\"}}}}},\"last_update_time\":1568396089028}"; - TestHelpers.assertFailWith(ADValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); + TestHelpers.assertFailWith(ValidationException.class, () -> AnomalyDetector.parse(TestHelpers.parser(detectorString))); } public void testParseAnomalyDetectorWithInvalidDetectorIntervalUnits() { @@ -261,7 +255,7 @@ public void testParseAnomalyDetectorWithInvalidDetectorIntervalUnits() { () -> AnomalyDetector.parse(TestHelpers.parser(detectorString)) ); assertEquals( - String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIME_CONFIGURATION_UNITS, ChronoUnit.MILLIS), + String.format(Locale.ROOT, ADCommonMessages.INVALID_TIME_CONFIGURATION_UNITS, ChronoUnit.MILLIS), exception.getMessage() ); } @@ -280,7 +274,7 @@ public void testParseAnomalyDetectorInvalidWindowDelayUnits() { () -> AnomalyDetector.parse(TestHelpers.parser(detectorString)) ); assertEquals( - String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIME_CONFIGURATION_UNITS, ChronoUnit.MILLIS), + String.format(Locale.ROOT, ADCommonMessages.INVALID_TIME_CONFIGURATION_UNITS, ChronoUnit.MILLIS), exception.getMessage() ); } @@ -303,7 +297,7 @@ public void testParseAnomalyDetectorWithEmptyUiMetadata() throws IOException { public void testInvalidShingleSize() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -321,7 +315,8 @@ public void testInvalidShingleSize() throws Exception { Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -329,7 +324,7 @@ public void testInvalidShingleSize() throws Exception { public void testNullDetectorName() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -341,13 +336,14 @@ public void testNullDetectorName() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -355,7 +351,7 @@ public void testNullDetectorName() throws Exception { public void testBlankDetectorName() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -367,13 +363,14 @@ public void testBlankDetectorName() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, 
Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -381,7 +378,7 @@ public void testBlankDetectorName() throws Exception { public void testNullTimeField() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -393,13 +390,14 @@ public void testNullTimeField() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -407,7 +405,7 @@ public void testNullTimeField() throws Exception { public void testNullIndices() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -419,13 +417,14 @@ public void testNullIndices() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -433,7 +432,7 @@ public void testNullIndices() throws Exception { public void testEmptyIndices() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -445,13 +444,14 @@ public void testEmptyIndices() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } @@ -459,7 +459,7 @@ public void testEmptyIndices() throws Exception { public void testNullDetectionInterval() throws Exception { TestHelpers .assertFailWith( - ADValidationException.class, + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(5), randomLong(), @@ -471,20 +471,21 @@ public void testNullDetectionInterval() throws Exception { TestHelpers.randomQuery(), null, TestHelpers.randomIntervalTimeConfiguration(), - AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE, + TimeSeriesSettings.DEFAULT_SHINGLE_SIZE, null, 1, Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ) ); } public void testInvalidDetectionInterval() { - ADValidationException exception = expectThrows( - ADValidationException.class, + ValidationException exception = expectThrows( + ValidationException.class, () -> new AnomalyDetector( randomAlphaOfLength(10), randomLong(), @@ -502,7 +503,8 @@ public void testInvalidDetectionInterval() { Instant.now(), null, null, - null + null, + TestHelpers.randomImputationOption() ) ); assertEquals("Detection interval must be a positive integer", exception.getMessage()); @@ -528,7 +530,8 @@ public void testInvalidWindowDelay() { Instant.now(), null, null, - null + null, + TestHelpers.randomImputationOption() ) ); assertEquals("Interval -1 should be non-negative", exception.getMessage()); @@ -567,7 +570,8 @@ public void 
testGetShingleSize() throws IOException { Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ); assertEquals((int) anomalyDetector.getShingleSize(), 5); } @@ -590,9 +594,10 @@ public void testGetShingleSizeReturnsDefaultValue() throws IOException { Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ); - assertEquals((int) anomalyDetector.getShingleSize(), AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE); + assertEquals((int) anomalyDetector.getShingleSize(), TimeSeriesSettings.DEFAULT_SHINGLE_SIZE); } public void testNullFeatureAttributes() throws IOException { @@ -613,27 +618,49 @@ public void testNullFeatureAttributes() throws IOException { Instant.now(), null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ); assertNotNull(anomalyDetector.getFeatureAttributes()); assertEquals(0, anomalyDetector.getFeatureAttributes().size()); } - public void testValidateResultIndex() { - String errorMessage = AnomalyDetector.validateResultIndex("abc"); + public void testValidateResultIndex() throws IOException { + AnomalyDetector anomalyDetector = new AnomalyDetector( + randomAlphaOfLength(5), + randomLong(), + randomAlphaOfLength(5), + randomAlphaOfLength(5), + randomAlphaOfLength(5), + ImmutableList.of(randomAlphaOfLength(5)), + ImmutableList.of(TestHelpers.randomFeature()), + TestHelpers.randomQuery(), + TestHelpers.randomIntervalTimeConfiguration(), + TestHelpers.randomIntervalTimeConfiguration(), + null, + null, + 1, + Instant.now(), + null, + TestHelpers.randomUser(), + null, + TestHelpers.randomImputationOption() + ); + + String errorMessage = anomalyDetector.validateCustomResultIndex("abc"); assertEquals(INVALID_RESULT_INDEX_PREFIX, errorMessage); StringBuilder resultIndexNameBuilder = new StringBuilder(CUSTOM_RESULT_INDEX_PREFIX); for (int i = 0; i < MAX_RESULT_INDEX_NAME_SIZE - CUSTOM_RESULT_INDEX_PREFIX.length(); i++) { resultIndexNameBuilder.append("a"); } - assertNull(AnomalyDetector.validateResultIndex(resultIndexNameBuilder.toString())); + assertNull(anomalyDetector.validateCustomResultIndex(resultIndexNameBuilder.toString())); resultIndexNameBuilder.append("a"); - errorMessage = AnomalyDetector.validateResultIndex(resultIndexNameBuilder.toString()); - assertEquals(INVALID_RESULT_INDEX_NAME_SIZE, errorMessage); + errorMessage = anomalyDetector.validateCustomResultIndex(resultIndexNameBuilder.toString()); + assertEquals(AnomalyDetector.INVALID_RESULT_INDEX_NAME_SIZE, errorMessage); - errorMessage = AnomalyDetector.validateResultIndex(CUSTOM_RESULT_INDEX_PREFIX + "abc#"); + errorMessage = anomalyDetector.validateCustomResultIndex(CUSTOM_RESULT_INDEX_PREFIX + "abc#"); assertEquals(INVALID_CHAR_IN_RESULT_INDEX_NAME, errorMessage); } diff --git a/src/test/java/org/opensearch/ad/model/AnomalyResultBucketTests.java b/src/test/java/org/opensearch/ad/model/AnomalyResultBucketTests.java index 05e6a816d..ec2daee06 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyResultBucketTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyResultBucketTests.java @@ -11,16 +11,16 @@ import java.util.HashMap; import java.util.Map; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.common.xcontent.XContentFactory; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.core.xcontent.ToXContent; import 
org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.TestHelpers; -public class AnomalyResultBucketTests extends AbstractADTest { +public class AnomalyResultBucketTests extends AbstractTimeSeriesTest { public void testSerializeAnomalyResultBucket() throws IOException { AnomalyResultBucket anomalyResultBucket = TestHelpers.randomAnomalyResultBucket(); diff --git a/src/test/java/org/opensearch/ad/model/AnomalyResultTests.java b/src/test/java/org/opensearch/ad/model/AnomalyResultTests.java index 20299ae0b..424de19da 100644 --- a/src/test/java/org/opensearch/ad/model/AnomalyResultTests.java +++ b/src/test/java/org/opensearch/ad/model/AnomalyResultTests.java @@ -17,8 +17,6 @@ import java.util.Collection; import java.util.Locale; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; @@ -26,6 +24,8 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; import com.google.common.base.Objects; @@ -33,7 +33,7 @@ public class AnomalyResultTests extends OpenSearchSingleNodeTestCase { @Override protected Collection<Class<? extends Plugin>> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override @@ -70,7 +70,7 @@ public void testParseAnomalyDetectorWithoutNormalResult() throws IOException { .replaceFirst("\\{", String.format(Locale.ROOT, "{\"%s\":\"%s\",", randomAlphaOfLength(5), randomAlphaOfLength(5))); AnomalyResult parsedDetectResult = AnomalyResult.parse(TestHelpers.parser(detectResultString)); assertTrue( - Objects.equal(detectResult.getDetectorId(), parsedDetectResult.getDetectorId()) + Objects.equal(detectResult.getConfigId(), parsedDetectResult.getConfigId()) && Objects.equal(detectResult.getTaskId(), parsedDetectResult.getTaskId()) && Objects.equal(detectResult.getAnomalyScore(), parsedDetectResult.getAnomalyScore()) && Objects.equal(detectResult.getAnomalyGrade(), parsedDetectResult.getAnomalyGrade()) @@ -95,7 +95,7 @@ public void testParseAnomalyDetectorWithNanAnomalyResult() throws IOException { assertNull(parsedDetectResult.getAnomalyGrade()); assertNull(parsedDetectResult.getAnomalyScore()); assertTrue( - Objects.equal(detectResult.getDetectorId(), parsedDetectResult.getDetectorId()) + Objects.equal(detectResult.getConfigId(), parsedDetectResult.getConfigId()) && Objects.equal(detectResult.getTaskId(), parsedDetectResult.getTaskId()) && Objects.equal(detectResult.getFeatureData(), parsedDetectResult.getFeatureData()) && Objects.equal(detectResult.getDataStartTime(), parsedDetectResult.getDataStartTime()) diff --git a/src/test/java/org/opensearch/ad/model/DetectionDateRangeTests.java b/src/test/java/org/opensearch/ad/model/DetectionDateRangeTests.java index e33c22104..ab507b027 100644 --- a/src/test/java/org/opensearch/ad/model/DetectionDateRangeTests.java +++ b/src/test/java/org/opensearch/ad/model/DetectionDateRangeTests.java @@ -17,8 +17,6 @@ import java.util.Collection; import java.util.Locale; -import
org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.common.io.stream.NamedWriteableRegistry; @@ -26,12 +24,15 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.model.DateRange; public class DetectionDateRangeTests extends OpenSearchSingleNodeTestCase { @Override protected Collection<Class<? extends Plugin>> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override @@ -40,44 +41,38 @@ protected NamedWriteableRegistry writableRegistry() { } public void testParseDetectionDateRangeWithNullStartTime() { - IllegalArgumentException exception = expectThrows( - IllegalArgumentException.class, - () -> new DetectionDateRange(null, Instant.now()) - ); + IllegalArgumentException exception = expectThrows(IllegalArgumentException.class, () -> new DateRange(null, Instant.now())); assertEquals("Detection data range's start time must not be null", exception.getMessage()); } public void testParseDetectionDateRangeWithNullEndTime() { - IllegalArgumentException exception = expectThrows( - IllegalArgumentException.class, - () -> new DetectionDateRange(Instant.now(), null) - ); + IllegalArgumentException exception = expectThrows(IllegalArgumentException.class, () -> new DateRange(Instant.now(), null)); assertEquals("Detection data range's end time must not be null", exception.getMessage()); } public void testInvalidDateRange() { IllegalArgumentException exception = expectThrows( IllegalArgumentException.class, - () -> new DetectionDateRange(Instant.now(), Instant.now().minus(10, ChronoUnit.MINUTES)) + () -> new DateRange(Instant.now(), Instant.now().minus(10, ChronoUnit.MINUTES)) ); assertEquals("Detection data range's end time must be after start time", exception.getMessage()); } public void testSerializeDetectoinDateRange() throws IOException { - DetectionDateRange dateRange = TestHelpers.randomDetectionDateRange(); + DateRange dateRange = TestHelpers.randomDetectionDateRange(); BytesStreamOutput output = new BytesStreamOutput(); dateRange.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); - DetectionDateRange parsedDateRange = new DetectionDateRange(input); + DateRange parsedDateRange = new DateRange(input); assertTrue(parsedDateRange.equals(dateRange)); } public void testParseDetectionDateRange() throws IOException { - DetectionDateRange dateRange = TestHelpers.randomDetectionDateRange(); + DateRange dateRange = TestHelpers.randomDetectionDateRange(); String dateRangeString = TestHelpers.xContentBuilderToString(dateRange.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); dateRangeString = dateRangeString .replaceFirst("\\{", String.format(Locale.ROOT, "{\"%s\":\"%s\",", randomAlphaOfLength(5), randomAlphaOfLength(5))); - DetectionDateRange parsedDateRange = DetectionDateRange.parse(TestHelpers.parser(dateRangeString)); + DateRange parsedDateRange = DateRange.parse(TestHelpers.parser(dateRangeString)); assertEquals("Parsing detection range doesn't work", dateRange,
parsedDateRange); } diff --git a/src/test/java/org/opensearch/ad/model/DetectorInternalStateTests.java b/src/test/java/org/opensearch/ad/model/DetectorInternalStateTests.java index e19cd2b27..2ea993b72 100644 --- a/src/test/java/org/opensearch/ad/model/DetectorInternalStateTests.java +++ b/src/test/java/org/opensearch/ad/model/DetectorInternalStateTests.java @@ -8,9 +8,9 @@ import java.io.IOException; import java.time.Instant; -import org.opensearch.ad.TestHelpers; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; public class DetectorInternalStateTests extends OpenSearchSingleNodeTestCase { diff --git a/src/test/java/org/opensearch/ad/model/DetectorProfileTests.java b/src/test/java/org/opensearch/ad/model/DetectorProfileTests.java index e87740e56..9960a5fe2 100644 --- a/src/test/java/org/opensearch/ad/model/DetectorProfileTests.java +++ b/src/test/java/org/opensearch/ad/model/DetectorProfileTests.java @@ -14,13 +14,14 @@ import java.io.IOException; import java.util.Map; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; import org.opensearch.core.xcontent.XContentParser; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Entity; public class DetectorProfileTests extends OpenSearchTestCase { @@ -88,18 +89,18 @@ public void testDetectorProfileToXContent() throws IOException { } public void testDetectorProfileName() throws IllegalArgumentException { - assertEquals("ad_task", DetectorProfileName.getName(CommonName.AD_TASK).getName()); - assertEquals("state", DetectorProfileName.getName(CommonName.STATE).getName()); - assertEquals("error", DetectorProfileName.getName(CommonName.ERROR).getName()); - assertEquals("coordinating_node", DetectorProfileName.getName(CommonName.COORDINATING_NODE).getName()); - assertEquals("shingle_size", DetectorProfileName.getName(CommonName.SHINGLE_SIZE).getName()); - assertEquals("total_size_in_bytes", DetectorProfileName.getName(CommonName.TOTAL_SIZE_IN_BYTES).getName()); - assertEquals("models", DetectorProfileName.getName(CommonName.MODELS).getName()); - assertEquals("init_progress", DetectorProfileName.getName(CommonName.INIT_PROGRESS).getName()); - assertEquals("total_entities", DetectorProfileName.getName(CommonName.TOTAL_ENTITIES).getName()); - assertEquals("active_entities", DetectorProfileName.getName(CommonName.ACTIVE_ENTITIES).getName()); + assertEquals("ad_task", DetectorProfileName.getName(ADCommonName.AD_TASK).getName()); + assertEquals("state", DetectorProfileName.getName(ADCommonName.STATE).getName()); + assertEquals("error", DetectorProfileName.getName(ADCommonName.ERROR).getName()); + assertEquals("coordinating_node", DetectorProfileName.getName(ADCommonName.COORDINATING_NODE).getName()); + assertEquals("shingle_size", DetectorProfileName.getName(ADCommonName.SHINGLE_SIZE).getName()); + assertEquals("total_size_in_bytes", DetectorProfileName.getName(ADCommonName.TOTAL_SIZE_IN_BYTES).getName()); + assertEquals("models", DetectorProfileName.getName(ADCommonName.MODELS).getName()); + assertEquals("init_progress", 
DetectorProfileName.getName(ADCommonName.INIT_PROGRESS).getName()); + assertEquals("total_entities", DetectorProfileName.getName(ADCommonName.TOTAL_ENTITIES).getName()); + assertEquals("active_entities", DetectorProfileName.getName(ADCommonName.ACTIVE_ENTITIES).getName()); IllegalArgumentException exception = expectThrows(IllegalArgumentException.class, () -> DetectorProfileName.getName("abc")); - assertEquals(exception.getMessage(), CommonErrorMessages.UNSUPPORTED_PROFILE_TYPE); + assertEquals(exception.getMessage(), ADCommonMessages.UNSUPPORTED_PROFILE_TYPE); } public void testDetectorProfileSet() throws IllegalArgumentException { diff --git a/src/test/java/org/opensearch/ad/model/EntityAnomalyResultTests.java b/src/test/java/org/opensearch/ad/model/EntityAnomalyResultTests.java index 2713e2c98..24cb0c879 100644 --- a/src/test/java/org/opensearch/ad/model/EntityAnomalyResultTests.java +++ b/src/test/java/org/opensearch/ad/model/EntityAnomalyResultTests.java @@ -12,7 +12,7 @@ package org.opensearch.ad.model; import static java.util.Arrays.asList; -import static org.opensearch.ad.TestHelpers.randomHCADAnomalyDetectResult; +import static org.opensearch.timeseries.TestHelpers.randomHCADAnomalyDetectResult; import java.util.ArrayList; import java.util.List; diff --git a/src/test/java/org/opensearch/ad/model/EntityProfileTests.java b/src/test/java/org/opensearch/ad/model/EntityProfileTests.java index 09bfb16ff..18e179145 100644 --- a/src/test/java/org/opensearch/ad/model/EntityProfileTests.java +++ b/src/test/java/org/opensearch/ad/model/EntityProfileTests.java @@ -15,15 +15,15 @@ import java.io.IOException; -import org.opensearch.ad.AbstractADTest; import org.opensearch.ad.common.exception.JsonPathNotFoundException; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.AbstractTimeSeriesTest; import test.org.opensearch.ad.util.JsonDeserializer; -public class EntityProfileTests extends AbstractADTest { +public class EntityProfileTests extends AbstractTimeSeriesTest { public void testMerge() { EntityProfile profile1 = new EntityProfile(null, -1, -1, null, null, EntityState.INIT); EntityProfile profile2 = new EntityProfile(null, -1, -1, null, null, EntityState.UNKNOWN); @@ -39,7 +39,7 @@ public void testToXContent() throws IOException, JsonPathNotFoundException { profile1.toXContent(builder, ToXContent.EMPTY_PARAMS); String json = builder.toString(); - assertEquals("INIT", JsonDeserializer.getTextValue(json, CommonName.STATE)); + assertEquals("INIT", JsonDeserializer.getTextValue(json, ADCommonName.STATE)); EntityProfile profile2 = new EntityProfile(null, -1, -1, null, null, EntityState.UNKNOWN); @@ -47,7 +47,7 @@ public void testToXContent() throws IOException, JsonPathNotFoundException { profile2.toXContent(builder, ToXContent.EMPTY_PARAMS); json = builder.toString(); - assertTrue(false == JsonDeserializer.hasChildNode(json, CommonName.STATE)); + assertTrue(false == JsonDeserializer.hasChildNode(json, ADCommonName.STATE)); } public void testToXContentTimeStampAboveZero() throws IOException, JsonPathNotFoundException { @@ -57,7 +57,7 @@ public void testToXContentTimeStampAboveZero() throws IOException, JsonPathNotFo profile1.toXContent(builder, ToXContent.EMPTY_PARAMS); String json = builder.toString(); - assertEquals("INIT", JsonDeserializer.getTextValue(json, CommonName.STATE)); + assertEquals("INIT", 
JsonDeserializer.getTextValue(json, ADCommonName.STATE)); EntityProfile profile2 = new EntityProfile(null, 1, 1, null, null, EntityState.UNKNOWN); @@ -65,6 +65,6 @@ public void testToXContentTimeStampAboveZero() throws IOException, JsonPathNotFo profile2.toXContent(builder, ToXContent.EMPTY_PARAMS); json = builder.toString(); - assertTrue(false == JsonDeserializer.hasChildNode(json, CommonName.STATE)); + assertTrue(false == JsonDeserializer.hasChildNode(json, ADCommonName.STATE)); } } diff --git a/src/test/java/org/opensearch/ad/model/EntityTests.java b/src/test/java/org/opensearch/ad/model/EntityTests.java index f3affd6c1..7c645d920 100644 --- a/src/test/java/org/opensearch/ad/model/EntityTests.java +++ b/src/test/java/org/opensearch/ad/model/EntityTests.java @@ -15,9 +15,10 @@ import java.util.Optional; import java.util.TreeMap; -import org.opensearch.ad.AbstractADTest; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.model.Entity; -public class EntityTests extends AbstractADTest { +public class EntityTests extends AbstractTimeSeriesTest { /** * Test that toStrign has no random string, but only attributes */ diff --git a/src/test/java/org/opensearch/ad/model/FeatureDataTests.java b/src/test/java/org/opensearch/ad/model/FeatureDataTests.java index 2b53fdbb8..bfd17bb95 100644 --- a/src/test/java/org/opensearch/ad/model/FeatureDataTests.java +++ b/src/test/java/org/opensearch/ad/model/FeatureDataTests.java @@ -14,9 +14,10 @@ import java.io.IOException; import java.util.Locale; -import org.opensearch.ad.TestHelpers; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.FeatureData; public class FeatureDataTests extends OpenSearchTestCase { diff --git a/src/test/java/org/opensearch/ad/model/FeatureTests.java b/src/test/java/org/opensearch/ad/model/FeatureTests.java index 7507764f0..bc3baafe8 100644 --- a/src/test/java/org/opensearch/ad/model/FeatureTests.java +++ b/src/test/java/org/opensearch/ad/model/FeatureTests.java @@ -14,9 +14,10 @@ import java.io.IOException; import java.util.Locale; -import org.opensearch.ad.TestHelpers; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Feature; public class FeatureTests extends OpenSearchTestCase { diff --git a/src/test/java/org/opensearch/ad/model/IntervalTimeConfigurationTests.java b/src/test/java/org/opensearch/ad/model/IntervalTimeConfigurationTests.java index 543bd5768..970d9fd89 100644 --- a/src/test/java/org/opensearch/ad/model/IntervalTimeConfigurationTests.java +++ b/src/test/java/org/opensearch/ad/model/IntervalTimeConfigurationTests.java @@ -16,9 +16,11 @@ import java.time.temporal.ChronoUnit; import java.util.Locale; -import org.opensearch.ad.TestHelpers; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.TimeConfiguration; public class IntervalTimeConfigurationTests extends OpenSearchTestCase { diff --git a/src/test/java/org/opensearch/ad/model/MergeableListTests.java b/src/test/java/org/opensearch/ad/model/MergeableListTests.java index 79b3f43bc..1375d72a7 100644 --- a/src/test/java/org/opensearch/ad/model/MergeableListTests.java +++ 
b/src/test/java/org/opensearch/ad/model/MergeableListTests.java @@ -14,9 +14,10 @@ import java.util.ArrayList; import java.util.List; -import org.opensearch.ad.AbstractADTest; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.model.MergeableList; -public class MergeableListTests extends AbstractADTest { +public class MergeableListTests extends AbstractTimeSeriesTest { public void testMergeableListGetElements() { List<String> ls1 = new ArrayList<String>(); diff --git a/src/test/java/org/opensearch/ad/model/ModelProfileTests.java b/src/test/java/org/opensearch/ad/model/ModelProfileTests.java index e5e675f9a..c99ff6222 100644 --- a/src/test/java/org/opensearch/ad/model/ModelProfileTests.java +++ b/src/test/java/org/opensearch/ad/model/ModelProfileTests.java @@ -15,14 +15,15 @@ import java.io.IOException; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.constant.CommonName; import org.opensearch.core.xcontent.ToXContent; import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; import test.org.opensearch.ad.util.JsonDeserializer; -public class ModelProfileTests extends AbstractADTest { +public class ModelProfileTests extends AbstractTimeSeriesTest { public void testToXContent() throws IOException { ModelProfile profile1 = new ModelProfile( diff --git a/src/test/java/org/opensearch/ad/plugin/MockReindexPlugin.java b/src/test/java/org/opensearch/ad/plugin/MockReindexPlugin.java index 9a0968173..b03c3f017 100644 --- a/src/test/java/org/opensearch/ad/plugin/MockReindexPlugin.java +++ b/src/test/java/org/opensearch/ad/plugin/MockReindexPlugin.java @@ -26,8 +26,7 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.HandledTransportAction; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.client.Client; import org.opensearch.common.inject.Inject; import org.opensearch.common.unit.TimeValue; @@ -44,6 +43,7 @@ import org.opensearch.plugins.Plugin; import org.opensearch.search.SearchHit; import org.opensearch.tasks.Task; +import org.opensearch.timeseries.TestHelpers; import org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableList; @@ -168,7 +168,7 @@ protected void doExecute(Task task, DeleteByQueryRequest request, ActionListener Iterator<SearchHit> iterator = r.getHits().iterator(); while (iterator.hasNext()) { String id = iterator.next().getId(); - DeleteRequest deleteRequest = new DeleteRequest(CommonName.DETECTION_STATE_INDEX, id) + DeleteRequest deleteRequest = new DeleteRequest(ADCommonName.DETECTION_STATE_INDEX, id) .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); client.delete(deleteRequest, delegateListener); } diff --git a/src/test/java/org/opensearch/ad/ratelimit/AbstractRateLimitingTest.java b/src/test/java/org/opensearch/ad/ratelimit/AbstractRateLimitingTest.java index 3caf23a99..d587fb75d 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/AbstractRateLimitingTest.java +++ b/src/test/java/org/opensearch/ad/ratelimit/AbstractRateLimitingTest.java @@ -12,6 +12,7 @@ package org.opensearch.ad.ratelimit; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.doAnswer; import static
org.mockito.Mockito.mock; import static org.mockito.Mockito.when; @@ -21,15 +22,16 @@ import java.util.Arrays; import java.util.Optional; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.core.action.ActionListener; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Entity; -public class AbstractRateLimitingTest extends AbstractADTest { +public class AbstractRateLimitingTest extends AbstractTimeSeriesTest { Clock clock; AnomalyDetector detector; NodeStateManager nodeStateManager; @@ -54,10 +56,10 @@ public void setUp() throws Exception { nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(1); + ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); entity = Entity.createSingleAttributeEntity(categoryField, "value"); entity2 = Entity.createSingleAttributeEntity(categoryField, "value2"); diff --git a/src/test/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapterTests.java b/src/test/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapterTests.java index 6a223fd0e..830ac3f65 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapterTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/CheckPointMaintainRequestAdapterTests.java @@ -29,7 +29,7 @@ import org.opensearch.action.update.UpdateRequest; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; @@ -58,8 +58,8 @@ public void setUp() throws Exception { super.setUp(); cache = mock(CacheProvider.class); checkpointDao = mock(CheckpointDao.class); - indexName = CommonName.CHECKPOINT_INDEX_NAME; - checkpointInterval = AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ; + indexName = ADCommonName.CHECKPOINT_INDEX_NAME; + checkpointInterval = AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ; EntityCache entityCache = mock(EntityCache.class); when(cache.get()).thenReturn(entityCache); state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).build()); @@ -67,7 +67,7 @@ public void setUp() throws Exception { clusterService = mock(ClusterService.class); ClusterSettings settings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ))) ); when(clusterService.getClusterSettings()).thenReturn(settings); adapter = new CheckPointMaintainRequestAdapter( diff --git a/src/test/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorkerTests.java
b/src/test/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorkerTests.java index 22e104ac2..0d05259fc 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/CheckpointMaintainWorkerTests.java @@ -32,10 +32,9 @@ import java.util.Optional; import java.util.Random; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; @@ -45,6 +44,8 @@ import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.common.unit.TimeValue; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import test.org.opensearch.ad.util.MLUtil; import test.org.opensearch.ad.util.RandomModelStateConfig; @@ -62,7 +63,7 @@ public class CheckpointMaintainWorkerTests extends AbstractRateLimitingTest { public void setUp() throws Exception { super.setUp(); clusterService = mock(ClusterService.class); - Settings settings = Settings.builder().put(AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.getKey(), 1).build(); + Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.getKey(), 1).build(); ClusterSettings clusterSettings = new ClusterSettings( settings, Collections .unmodifiableSet( new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, - AnomalyDetectorSettings.CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ + AnomalyDetectorSettings.AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ ) ) ) @@ -84,8 +85,8 @@ public void setUp() throws Exception { CacheProvider cache = mock(CacheProvider.class); checkpointDao = mock(CheckpointDao.class); - String indexName = CommonName.CHECKPOINT_INDEX_NAME; - Setting<TimeValue> checkpointInterval = AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ; + String indexName = ADCommonName.CHECKPOINT_INDEX_NAME; + Setting<TimeValue> checkpointInterval = AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ; EntityCache entityCache = mock(EntityCache.class); when(cache.get()).thenReturn(entityCache); ModelState<EntityModel> state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).build()); @@ -104,19 +105,19 @@ public void setUp() throws Exception { cpMaintainWorker = new CheckpointMaintainWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_MAINTAIN_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, -
AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, writeWorker, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager, adapter ); @@ -134,7 +135,7 @@ public void setUp() throws Exception { TimeValue value = invocation.getArgument(1); // since we have only 1 request each time - long expectedExecutionPerRequestMilli = AnomalyDetectorSettings.EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS + long expectedExecutionPerRequestMilli = AnomalyDetectorSettings.AD_EXPECTED_CHECKPOINT_MAINTAIN_TIME_IN_MILLISECS .getDefault(Settings.EMPTY); long delay = value.getMillis(); assertTrue(delay == expectedExecutionPerRequestMilli); diff --git a/src/test/java/org/opensearch/ad/ratelimit/CheckpointReadWorkerTests.java b/src/test/java/org/opensearch/ad/ratelimit/CheckpointReadWorkerTests.java index 9512aee98..41b8035b0 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/CheckpointReadWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/CheckpointReadWorkerTests.java @@ -45,25 +45,19 @@ import org.opensearch.action.get.GetResponse; import org.opensearch.action.get.MultiGetItemResponse; import org.opensearch.action.get.MultiGetResponse; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.ml.ModelState; import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.stats.ADStat; import org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; import org.opensearch.ad.stats.suppliers.CounterSupplier; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.ClusterSettings; @@ -76,6 +70,14 @@ import org.opensearch.index.seqno.SequenceNumbers; import org.opensearch.threadpool.ThreadPoolStats; import org.opensearch.threadpool.ThreadPoolStats.Stats; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.stats.StatNames; import com.fasterxml.jackson.core.JsonParseException; @@ -94,7 +96,7 @@ public class CheckpointReadWorkerTests extends AbstractRateLimitingTest { ModelManager modelManager; EntityColdStartWorker coldstartQueue; ResultWriteWorker resultWriteQueue; - AnomalyDetectionIndices anomalyDetectionIndices; + ADIndexManagement anomalyDetectionIndices; CacheProvider cacheProvider; EntityCache entityCache; 
EntityFeatureRequest request, request2, request3; @@ -112,9 +114,9 @@ public void setUp() throws Exception { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE ) ) ) @@ -136,7 +138,7 @@ public void setUp() throws Exception { coldstartQueue = mock(EntityColdStartWorker.class); resultWriteQueue = mock(ResultWriteWorker.class); - anomalyDetectionIndices = mock(AnomalyDetectionIndices.class); + anomalyDetectionIndices = mock(ADIndexManagement.class); cacheProvider = mock(CacheProvider.class); entityCache = mock(EntityCache.class); @@ -155,18 +157,18 @@ worker = new CheckpointReadWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, modelManager, checkpoint, coldstartQueue, @@ -174,7 +176,7 @@ nodeStateManager, anomalyDetectionIndices, cacheProvider, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, adStats ); @@ -218,7 +220,7 @@ private void regularTestSetUp(RegularSetUpConfig config) { MultiGetItemResponse[] items = new MultiGetItemResponse[1]; items[0] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) + new GetResult(ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) ), null ); @@ -270,9 +272,9 @@ public void testIndexNotFound() { items[0] = new MultiGetItemResponse( null, new MultiGetResponse.Failure( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), - new IndexNotFoundException(CommonName.CHECKPOINT_INDEX_NAME) + new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME) ) ); ActionListener<MultiGetResponse> listener = invocation.getArgument(1); @@ -291,7 +293,7 @@ public void testAllDocNotFound() { items[0] = new MultiGetItemResponse( new GetResponse( new GetResult( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), SequenceNumbers.UNASSIGNED_SEQ_NO, SequenceNumbers.UNASSIGNED_PRIMARY_TERM, @@ -307,7 +309,7 @@ items[1] = new MultiGetItemResponse( new GetResponse( new GetResult( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity2.getModelId(detectorId).get(),
SequenceNumbers.UNASSIGNED_SEQ_NO, SequenceNumbers.UNASSIGNED_PRIMARY_TERM, @@ -339,14 +341,14 @@ public void testSingleDocNotFound() { MultiGetItemResponse[] items = new MultiGetItemResponse[2]; items[0] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) + new GetResult(ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) ), null ); items[1] = new MultiGetItemResponse( new GetResponse( new GetResult( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity2.getModelId(detectorId).get(), SequenceNumbers.UNASSIGNED_SEQ_NO, SequenceNumbers.UNASSIGNED_PRIMARY_TERM, @@ -380,7 +382,7 @@ public void testTimeout() { items[0] = new MultiGetItemResponse( null, new MultiGetResponse.Failure( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), new OpenSearchStatusException("blah", RestStatus.REQUEST_TIMEOUT) ) @@ -388,7 +390,7 @@ public void testTimeout() { items[1] = new MultiGetItemResponse( null, new MultiGetResponse.Failure( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity2.getModelId(detectorId).get(), new OpenSearchStatusException("blah", RestStatus.CONFLICT) ) @@ -398,7 +400,7 @@ public void testTimeout() { items[0] = new MultiGetItemResponse( new GetResponse( new GetResult( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, @@ -414,7 +416,7 @@ public void testTimeout() { items[1] = new MultiGetItemResponse( new GetResponse( new GetResult( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity2.getModelId(detectorId).get(), 1, 1, @@ -450,7 +452,7 @@ public void testOverloadedExceptionFromResponse() { items[0] = new MultiGetItemResponse( null, new MultiGetResponse.Failure( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), new OpenSearchRejectedExecutionException("blah") ) @@ -489,7 +491,7 @@ public void testUnexpectedException() { items[0] = new MultiGetItemResponse( null, new MultiGetResponse.Failure( - CommonName.CHECKPOINT_INDEX_NAME, + ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), new IllegalArgumentException("blah") ) @@ -529,23 +531,23 @@ public void testRetryableException() { public void testRemoveUnusedQueues() { // do nothing when putting a request to keep queues not empty ExecutorService executorService = mock(ExecutorService.class); - when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); worker = new CheckpointReadWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + 
TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, modelManager, checkpoint, coldstartQueue, @@ -553,7 +555,7 @@ public void testRemoveUnusedQueues() { nodeStateManager, anomalyDetectionIndices, cacheProvider, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, adStats ); @@ -564,7 +566,7 @@ public void testRemoveUnusedQueues() { assertEquals(CheckpointReadWorker.WORKER_NAME, worker.getWorkerName()); // make RequestQueue.expired return true - when(clock.instant()).thenReturn(Instant.now().plusSeconds(AnomalyDetectorSettings.HOURLY_MAINTENANCE.getSeconds() + 1)); + when(clock.instant()).thenReturn(Instant.now().plusSeconds(TimeSeriesSettings.HOURLY_MAINTENANCE.getSeconds() + 1)); // removed the expired queue worker.maintenance(); @@ -575,7 +577,7 @@ public void testRemoveUnusedQueues() { private void maintenanceSetup() { // do nothing when putting a request to keep queues not empty ExecutorService executorService = mock(ExecutorService.class); - when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); when(threadPool.stats()).thenReturn(new ThreadPoolStats(new ArrayList())); } @@ -586,18 +588,18 @@ public void testSettingUpdatable() { worker = new CheckpointReadWorker( 2000, 1, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, modelManager, checkpoint, coldstartQueue, @@ -605,7 +607,7 @@ public void testSettingUpdatable() { nodeStateManager, anomalyDetectionIndices, cacheProvider, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, adStats ); @@ -620,7 +622,7 @@ public void testSettingUpdatable() { Settings newSettings = Settings .builder() - .put(AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT.getKey(), "0.0001") + .put(AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT.getKey(), "0.0001") .build(); Settings.Builder target = Settings.builder(); clusterSettings.updateDynamicSettings(newSettings, target, Settings.builder(), "test"); @@ -633,24 +635,24 @@ public void testSettingUpdatable() { public void testOpenCircuitBreaker() { maintenanceSetup(); - ADCircuitBreakerService breaker = mock(ADCircuitBreakerService.class); + CircuitBreakerService breaker = mock(CircuitBreakerService.class); when(breaker.isOpen()).thenReturn(true); worker = new CheckpointReadWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, 
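// --- Illustrative aside, not part of the patch: the dynamic-update dance from
// testSettingUpdatable above. Only the setting key changes with the AD_ rename;
// the ClusterSettings plumbing stays the same. "0.0001" mirrors the test's
// deliberately tiny heap cap.
Settings newSettings = Settings
    .builder()
    .put(AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT.getKey(), "0.0001")
    .build();
Settings.Builder target = Settings.builder();
// validate the dynamic subset, merge it into target, then publish to registered listeners
clusterSettings.updateDynamicSettings(newSettings, target, Settings.builder(), "test");
clusterSettings.applySettings(target.build());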
clusterService, new Random(42), breaker, threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, modelManager, checkpoint, coldstartQueue, @@ -658,7 +660,7 @@ public void testOpenCircuitBreaker() { nodeStateManager, anomalyDetectionIndices, cacheProvider, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, checkpointWriteQueue, adStats ); @@ -673,7 +675,7 @@ public void testOpenCircuitBreaker() { assertTrue(!worker.isQueueEmpty()); // one request per batch - Settings newSettings = Settings.builder().put(AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE.getKey(), "1").build(); + Settings newSettings = Settings.builder().put(AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE.getKey(), "1").build(); Settings.Builder target = Settings.builder(); clusterSettings.updateDynamicSettings(newSettings, target, Settings.builder(), "test"); clusterSettings.applySettings(target.build()); @@ -686,7 +688,7 @@ public void testOpenCircuitBreaker() { MultiGetItemResponse[] items = new MultiGetItemResponse[1]; items[0] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) + new GetResult(ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) ), null ); @@ -711,10 +713,10 @@ public void testChangePriority() { } public void testDetectorId() { - assertEquals(detectorId, request.getDetectorId()); + assertEquals(detectorId, request.getId()); String newDetectorId = "456"; request.setDetectorId(newDetectorId); - assertEquals(newDetectorId, request.getDetectorId()); + assertEquals(newDetectorId, request.getId()); } @SuppressWarnings("unchecked") @@ -733,28 +735,38 @@ public void testHostException() throws IOException { AnomalyDetector detector2 = TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId2, Arrays.asList(categoryField)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector2)); return null; - }).when(nodeStateManager).getAnomalyDetector(eq(detectorId2), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(eq(detectorId2), eq(AnalysisType.AD), any(ActionListener.class)); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(eq(detectorId), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(eq(detectorId), eq(AnalysisType.AD), any(ActionListener.class)); doAnswer(invocation -> { MultiGetItemResponse[] items = new MultiGetItemResponse[2]; items[0] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) + new GetResult(ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, 
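// --- Illustrative aside, not part of the patch: the stubbing change that recurs
// in testHostException above and throughout these tests. getAnomalyDetector(id,
// listener) became getConfig(id, AnalysisType.AD, listener), so doAnswer now
// pulls the listener from argument index 2 instead of 1. The listener's generic
// type is a best guess; the flattened diff dropped type parameters.
doAnswer(invocation -> {
    ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(2);
    listener.onResponse(Optional.of(detector));
    return null;
}).when(nodeStateManager).getConfig(eq(detectorId), eq(AnalysisType.AD), any(ActionListener.class));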
true, null, null, null) ), null ); items[1] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity4.getModelId(detectorId2).get(), 1, 1, 0, true, null, null, null) + new GetResult( + ADCommonName.CHECKPOINT_INDEX_NAME, + entity4.getModelId(detectorId2).get(), + 1, + 1, + 0, + true, + null, + null, + null + ) ), null ); @@ -781,7 +793,7 @@ public void testFailToScore() { MultiGetItemResponse[] items = new MultiGetItemResponse[1]; items[0] = new MultiGetItemResponse( new GetResponse( - new GetResult(CommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) + new GetResult(ADCommonName.CHECKPOINT_INDEX_NAME, entity.getModelId(detectorId).get(), 1, 1, 0, true, null, null, null) ), null ); diff --git a/src/test/java/org/opensearch/ad/ratelimit/CheckpointWriteWorkerTests.java b/src/test/java/org/opensearch/ad/ratelimit/CheckpointWriteWorkerTests.java index fdcd73410..be83484ee 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/CheckpointWriteWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/CheckpointWriteWorkerTests.java @@ -21,7 +21,7 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE; import java.io.IOException; import java.time.Instant; @@ -45,9 +45,7 @@ import org.opensearch.action.bulk.BulkItemResponse.Failure; import org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.index.IndexResponse; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.breaker.ADCircuitBreakerService; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.ml.CheckpointDao; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; @@ -63,6 +61,11 @@ import org.opensearch.core.rest.RestStatus; import org.opensearch.index.engine.VersionConflictEngineException; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import test.org.opensearch.ad.util.MLUtil; import test.org.opensearch.ad.util.RandomModelStateConfig; @@ -87,9 +90,9 @@ public void setUp() throws Exception { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE ) ) ) @@ -98,31 +101,31 @@ public void setUp() throws Exception { checkpoint = mock(CheckpointDao.class); Map checkpointMap = new HashMap<>(); - checkpointMap.put(CheckpointDao.FIELD_MODEL, "a"); + checkpointMap.put(CommonName.FIELD_MODEL, "a"); when(checkpoint.toIndexSource(any())).thenReturn(checkpointMap); when(checkpoint.shouldSave(any(), anyBoolean(), any(), any())).thenReturn(true); // Integer.MAX_VALUE makes a huge heap worker = new 
CheckpointWriteWorker( Integer.MAX_VALUE, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, checkpoint, - CommonName.CHECKPOINT_INDEX_NAME, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + ADCommonName.CHECKPOINT_INDEX_NAME, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager, - AnomalyDetectorSettings.HOURLY_MAINTENANCE + TimeSeriesSettings.HOURLY_MAINTENANCE ); state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().build()); @@ -181,7 +184,7 @@ public void testTriggerAutoFlush() throws InterruptedException { ExecutorService executorService = mock(ExecutorService.class); ThreadPool mockThreadPool = mock(ThreadPool.class); - when(mockThreadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(mockThreadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); doAnswer(invocation -> { Runnable runnable = () -> { try { @@ -209,24 +212,24 @@ public void testTriggerAutoFlush() throws InterruptedException { // create a worker to use mockThreadPool worker = new CheckpointWriteWorker( Integer.MAX_VALUE, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.CHECKPOINT_WRITE_QUEUE_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), mockThreadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, checkpoint, - CommonName.CHECKPOINT_INDEX_NAME, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + ADCommonName.CHECKPOINT_INDEX_NAME, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager, - AnomalyDetectorSettings.HOURLY_MAINTENANCE + TimeSeriesSettings.HOURLY_MAINTENANCE ); // our concurrency is 2, so first 2 requests cause two batches. 
And the @@ -234,7 +237,7 @@ public void testTriggerAutoFlush() throws InterruptedException { // first 2 batch account for one checkpoint.batchWrite; the remaining one // calls checkpoint.batchWrite // CHECKPOINT_WRITE_QUEUE_BATCH_SIZE is the largest batch size - int numberOfRequests = 2 * CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.getDefault(Settings.EMPTY) + 1; + int numberOfRequests = 2 * AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE.getDefault(Settings.EMPTY) + 1; for (int i = 0; i < numberOfRequests; i++) { ModelState state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().build()); worker.write(state, true, RequestPriority.MEDIUM); @@ -265,7 +268,7 @@ public void testOverloaded() { worker.write(state, true, RequestPriority.MEDIUM); verify(checkpoint, times(1)).batchWrite(any(), any()); - verify(nodeStateManager, times(1)).setException(eq(state.getDetectorId()), any(OpenSearchRejectedExecutionException.class)); + verify(nodeStateManager, times(1)).setException(eq(state.getId()), any(OpenSearchRejectedExecutionException.class)); } public void testRetryException() { @@ -279,7 +282,7 @@ public void testRetryException() { worker.write(state, true, RequestPriority.MEDIUM); // we don't retry checkpoint write verify(checkpoint, times(1)).batchWrite(any(), any()); - verify(nodeStateManager, times(1)).setException(eq(state.getDetectorId()), any(OpenSearchStatusException.class)); + verify(nodeStateManager, times(1)).setException(eq(state.getId()), any(OpenSearchStatusException.class)); } /** @@ -352,7 +355,7 @@ public void testEmptyModelId() { when(state.getLastCheckpointTime()).thenReturn(Instant.now()); EntityModel model = mock(EntityModel.class); when(state.getModel()).thenReturn(model); - when(state.getDetectorId()).thenReturn("1"); + when(state.getId()).thenReturn("1"); when(state.getModelId()).thenReturn(null); worker.write(state, true, RequestPriority.MEDIUM); @@ -365,7 +368,7 @@ public void testEmptyDetectorId() { when(state.getLastCheckpointTime()).thenReturn(Instant.now()); EntityModel model = mock(EntityModel.class); when(state.getModel()).thenReturn(model); - when(state.getDetectorId()).thenReturn(null); + when(state.getId()).thenReturn(null); when(state.getModelId()).thenReturn("a"); worker.write(state, true, RequestPriority.MEDIUM); @@ -375,10 +378,10 @@ public void testEmptyDetectorId() { @SuppressWarnings("unchecked") public void testDetectorNotAvailableSingleWrite() { doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.empty()); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); worker.write(state, true, RequestPriority.MEDIUM); verify(checkpoint, never()).batchWrite(any(), any()); @@ -387,10 +390,10 @@ public void testDetectorNotAvailableSingleWrite() { @SuppressWarnings("unchecked") public void testDetectorNotAvailableWriteAll() { doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.empty()); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); List> states = new ArrayList<>(); states.add(state); @@ -401,10 +404,10 @@ public void 
testDetectorNotAvailableWriteAll() { @SuppressWarnings("unchecked") public void testDetectorFetchException() { doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onFailure(new RuntimeException()); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); worker.write(state, true, RequestPriority.MEDIUM); verify(checkpoint, never()).batchWrite(any(), any()); diff --git a/src/test/java/org/opensearch/ad/ratelimit/ColdEntityWorkerTests.java b/src/test/java/org/opensearch/ad/ratelimit/ColdEntityWorkerTests.java index 47c35d625..d093f20ae 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/ColdEntityWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/ColdEntityWorkerTests.java @@ -27,12 +27,13 @@ import java.util.List; import java.util.Random; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; import org.opensearch.common.unit.TimeValue; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.settings.TimeSeriesSettings; public class ColdEntityWorkerTests extends AbstractRateLimitingTest { ClusterService clusterService; @@ -45,7 +46,7 @@ public class ColdEntityWorkerTests extends AbstractRateLimitingTest { public void setUp() throws Exception { super.setUp(); clusterService = mock(ClusterService.class); - Settings settings = Settings.builder().put(AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE.getKey(), 1).build(); + Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE.getKey(), 1).build(); ClusterSettings clusterSettings = new ClusterSettings( settings, Collections @@ -53,9 +54,9 @@ public void setUp() throws Exception { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE + AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE ) ) ) @@ -68,19 +69,19 @@ public void setUp() throws Exception { coldWorker = new ColdEntityWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, settings, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, readWorker, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager ); @@ -99,7 +100,7 @@ 
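// --- Illustrative aside, not part of the patch: the shape of the delay check
// that follows in ColdEntityWorkerTests. With the read-queue batch size forced
// to 1, each hand-off the cold entity worker schedules should be spaced by the
// per-request execution estimate, now AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS.
// The schedule() parameter order is assumed from the test's getArgument(1) usage.
doAnswer(invocation -> {
    TimeValue delay = invocation.getArgument(1);
    long expectedMillis = AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS
        .getDefault(Settings.EMPTY);
    assertTrue(delay.getMillis() == expectedMillis);
    return null;
}).when(threadPool).schedule(any(Runnable.class), any(TimeValue.class), anyString());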
public void setUp() throws Exception { TimeValue value = invocation.getArgument(1); // since we have only 1 request each time - long expectedExecutionPerRequestMilli = AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS + long expectedExecutionPerRequestMilli = AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS .getDefault(Settings.EMPTY); long delay = value.getMillis(); assertTrue(delay == expectedExecutionPerRequestMilli); @@ -143,9 +144,9 @@ public void testDelay() { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE + AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE ) ) ) @@ -156,19 +157,19 @@ public void testDelay() { coldWorker = new ColdEntityWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_FEATURE_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, readWorker, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager ); diff --git a/src/test/java/org/opensearch/ad/ratelimit/EntityColdStartWorkerTests.java b/src/test/java/org/opensearch/ad/ratelimit/EntityColdStartWorkerTests.java index 4b123b2e3..9fdf5a396 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/EntityColdStartWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/EntityColdStartWorkerTests.java @@ -28,7 +28,6 @@ import java.util.Random; import org.opensearch.OpenSearchStatusException; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.ml.EntityColdStarter; import org.opensearch.ad.ml.EntityModel; @@ -40,6 +39,8 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.concurrency.OpenSearchRejectedExecutionException; import org.opensearch.core.rest.RestStatus; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import test.org.opensearch.ad.util.MLUtil; @@ -60,8 +61,8 @@ public void setUp() throws Exception { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_CONCURRENCY + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_CONCURRENCY ) ) ) @@ -76,20 +77,20 @@ public void setUp() throws Exception { worker = new EntityColdStartWorker( Integer.MAX_VALUE, AnomalyDetectorSettings.ENTITY_REQUEST_SIZE_IN_BYTES, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, + 
AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, entityColdStarter, - AnomalyDetectorSettings.HOURLY_MAINTENANCE, + TimeSeriesSettings.HOURLY_MAINTENANCE, nodeStateManager, cacheProvider ); diff --git a/src/test/java/org/opensearch/ad/ratelimit/ResultWriteWorkerTests.java b/src/test/java/org/opensearch/ad/ratelimit/ResultWriteWorkerTests.java index ab3f1a1f4..304a942c7 100644 --- a/src/test/java/org/opensearch/ad/ratelimit/ResultWriteWorkerTests.java +++ b/src/test/java/org/opensearch/ad/ratelimit/ResultWriteWorkerTests.java @@ -33,15 +33,12 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.index.IndexRequest; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.breaker.ADCircuitBreakerService; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.transport.ADResultBulkRequest; import org.opensearch.ad.transport.ADResultBulkResponse; import org.opensearch.ad.transport.handler.MultiEntityResultHandler; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; @@ -50,6 +47,10 @@ import org.opensearch.core.rest.RestStatus; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.RestHandlerUtils; public class ResultWriteWorkerTests extends AbstractRateLimitingTest { ResultWriteWorker resultWriteQueue; @@ -69,9 +70,9 @@ public void setUp() throws Exception { new HashSet<>( Arrays .asList( - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_BATCH_SIZE + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_BATCH_SIZE ) ) ) @@ -85,23 +86,23 @@ public void setUp() throws Exception { resultWriteQueue = new ResultWriteWorker( Integer.MAX_VALUE, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_SIZE_IN_BYTES, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, + TimeSeriesSettings.RESULT_WRITE_QUEUE_SIZE_IN_BYTES, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, clusterService, new Random(42), - mock(ADCircuitBreakerService.class), + mock(CircuitBreakerService.class), threadPool, Settings.EMPTY, - AnomalyDetectorSettings.MAX_QUEUED_TASKS_RATIO, + TimeSeriesSettings.MAX_QUEUED_TASKS_RATIO, clock, - 
AnomalyDetectorSettings.MEDIUM_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.LOW_SEGMENT_PRUNE_RATIO, - AnomalyDetectorSettings.MAINTENANCE_FREQ_CONSTANT, - AnomalyDetectorSettings.QUEUE_MAINTENANCE, + TimeSeriesSettings.MEDIUM_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.LOW_SEGMENT_PRUNE_RATIO, + TimeSeriesSettings.MAINTENANCE_FREQ_CONSTANT, + TimeSeriesSettings.QUEUE_MAINTENANCE, resultHandler, xContentRegistry(), nodeStateManager, - AnomalyDetectorSettings.HOURLY_MAINTENANCE + TimeSeriesSettings.HOURLY_MAINTENANCE ); detectResult = TestHelpers.randomHCADAnomalyDetectResult(0.8, Double.NaN, null); @@ -137,7 +138,7 @@ public void testRegular() { public void testSingleRetryRequest() throws IOException { List retryRequests = new ArrayList<>(); try (XContentBuilder builder = jsonBuilder()) { - IndexRequest indexRequest = new IndexRequest(CommonName.ANOMALY_RESULT_INDEX_ALIAS) + IndexRequest indexRequest = new IndexRequest(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS) .source(detectResult.toXContent(builder, RestHandlerUtils.XCONTENT_WITH_TYPE)); retryRequests.add(indexRequest); } diff --git a/src/test/java/org/opensearch/ad/rest/ADRestTestUtils.java b/src/test/java/org/opensearch/ad/rest/ADRestTestUtils.java index 9486e7ec8..6b9d01f56 100644 --- a/src/test/java/org/opensearch/ad/rest/ADRestTestUtils.java +++ b/src/test/java/org/opensearch/ad/rest/ADRestTestUtils.java @@ -9,15 +9,15 @@ package org.opensearch.ad.rest; import static com.carrotsearch.randomizedtesting.RandomizedTest.randomBoolean; -import static org.opensearch.ad.util.RestHandlerUtils.ANOMALY_DETECTOR_JOB; -import static org.opensearch.ad.util.RestHandlerUtils.HISTORICAL_ANALYSIS_TASK; -import static org.opensearch.ad.util.RestHandlerUtils.REALTIME_TASK; import static org.opensearch.test.OpenSearchTestCase.randomAlphaOfLength; import static org.opensearch.test.OpenSearchTestCase.randomDoubleBetween; import static org.opensearch.test.OpenSearchTestCase.randomInt; import static org.opensearch.test.OpenSearchTestCase.randomIntBetween; import static org.opensearch.test.OpenSearchTestCase.randomLong; import static org.opensearch.test.rest.OpenSearchRestTestCase.entityAsMap; +import static org.opensearch.timeseries.util.RestHandlerUtils.ANOMALY_DETECTOR_JOB; +import static org.opensearch.timeseries.util.RestHandlerUtils.HISTORICAL_ANALYSIS_TASK; +import static org.opensearch.timeseries.util.RestHandlerUtils.REALTIME_TASK; import java.io.IOException; import java.time.Instant; @@ -35,17 +35,17 @@ import org.apache.http.util.EntityUtils; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.core.Logger; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskProfile; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.client.Response; import org.opensearch.client.RestClient; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -209,11 +209,12 @@ public static Response createAnomalyDetector( now, categoryFields, 
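// --- Illustrative aside, not part of the patch: the retry request built in
// testSingleRetryRequest above, after the result-index alias moved to ADCommonName
// and RestHandlerUtils moved to the shared timeseries util package. jsonBuilder()
// is the usual XContentFactory static import; try-with-resources closes the builder.
try (XContentBuilder builder = jsonBuilder()) {
    IndexRequest indexRequest = new IndexRequest(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)
        .source(detectResult.toXContent(builder, RestHandlerUtils.XCONTENT_WITH_TYPE));
    retryRequests.add(indexRequest);
}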
TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ); if (historical) { - detector.setDetectionDateRange(new DetectionDateRange(now.minus(30, ChronoUnit.DAYS), now)); + detector.setDetectionDateRange(new DateRange(now.minus(30, ChronoUnit.DAYS), now)); } return TestHelpers @@ -268,7 +269,7 @@ public static List searchLatestAdTaskOfDetector(RestClient client, Strin .builder() .taskId(id) .state(state) - .detectorId(parsedDetectorId) + .configId(parsedDetectorId) .taskProgress(taskProgress.floatValue()) .initProgress(initProgress.floatValue()) .taskType(parsedTaskType) @@ -349,12 +350,12 @@ public static Map getDetectorWithJobAndTask(RestClient client, S Map jobMap = (Map) responseMap.get(ANOMALY_DETECTOR_JOB); if (jobMap != null) { - String jobName = (String) jobMap.get(AnomalyDetectorJob.NAME_FIELD); - boolean enabled = (boolean) jobMap.get(AnomalyDetectorJob.IS_ENABLED_FIELD); - long enabledTime = (long) jobMap.get(AnomalyDetectorJob.ENABLED_TIME_FIELD); - long lastUpdateTime = (long) jobMap.get(AnomalyDetectorJob.LAST_UPDATE_TIME_FIELD); + String jobName = (String) jobMap.get(Job.NAME_FIELD); + boolean enabled = (boolean) jobMap.get(Job.IS_ENABLED_FIELD); + long enabledTime = (long) jobMap.get(Job.ENABLED_TIME_FIELD); + long lastUpdateTime = (long) jobMap.get(Job.LAST_UPDATE_TIME_FIELD); - AnomalyDetectorJob job = new AnomalyDetectorJob( + Job job = new Job( jobName, null, null, @@ -396,7 +397,7 @@ private static ADTask parseAdTask(Map taskMap) { .builder() .taskId(id) .state(state) - .detectorId(parsedDetectorId) + .configId(parsedDetectorId) .taskProgress(taskProgress.floatValue()) .initProgress(initProgress.floatValue()) .taskType(parsedTaskType) @@ -446,7 +447,7 @@ public static String startAnomalyDetectorDirectly(RestClient client, String dete @SuppressWarnings("unchecked") public static String startHistoricalAnalysis(RestClient client, String detectorId) throws IOException { Instant now = Instant.now(); - DetectionDateRange dateRange = new DetectionDateRange(now.minus(30, ChronoUnit.DAYS), now); + DateRange dateRange = new DateRange(now.minus(30, ChronoUnit.DAYS), now); Response response = TestHelpers .makeRequest( client, diff --git a/src/test/java/org/opensearch/ad/rest/AnomalyDetectorRestApiIT.java b/src/test/java/org/opensearch/ad/rest/AnomalyDetectorRestApiIT.java index 7314144a6..990f0f0c7 100644 --- a/src/test/java/org/opensearch/ad/rest/AnomalyDetectorRestApiIT.java +++ b/src/test/java/org/opensearch/ad/rest/AnomalyDetectorRestApiIT.java @@ -12,9 +12,9 @@ package org.opensearch.ad.rest; import static org.hamcrest.Matchers.containsString; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG; import static org.opensearch.ad.rest.handler.AbstractAnomalyDetectorActionHandler.DUPLICATE_DETECTOR_MSG; import static org.opensearch.ad.rest.handler.AbstractAnomalyDetectorActionHandler.NO_DOCS_IN_USER_INDEX_MSG; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG; import java.io.IOException; import java.time.Instant; @@ -28,22 +28,17 @@ import java.util.stream.Collectors; import org.apache.http.entity.ContentType; -import org.apache.http.nio.entity.NStringEntity; +import org.apache.http.entity.StringEntity; +import org.hamcrest.CoreMatchers; import org.junit.Assert; -import org.opensearch.ad.AnomalyDetectorPlugin; import org.opensearch.ad.AnomalyDetectorRestTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonErrorMessages; -import 
org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyDetectorExecutionInput; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.Feature; import org.opensearch.ad.rest.handler.AbstractAnomalyDetectorActionHandler; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.settings.EnabledSetting; +import org.opensearch.ad.settings.ADEnabledSetting; import org.opensearch.client.Response; import org.opensearch.client.ResponseException; import org.opensearch.common.UUIDs; @@ -52,6 +47,14 @@ import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.index.query.QueryBuilders; import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -135,13 +138,14 @@ public void testCreateAnomalyDetectorWithDuplicateName() throws Exception { TestHelpers.randomQuery(), TestHelpers.randomIntervalTimeConfiguration(), TestHelpers.randomIntervalTimeConfiguration(), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), TestHelpers.randomUiMetadata(), randomInt(), null, null, TestHelpers.randomUser(), - null + null, + TestHelpers.randomImputationOption() ); TestHelpers @@ -162,7 +166,7 @@ public void testCreateAnomalyDetectorWithDuplicateName() throws Exception { public void testCreateAnomalyDetector() throws Exception { AnomalyDetector detector = createIndexAndGetAnomalyDetector(INDEX_NAME); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -176,9 +180,9 @@ public void testCreateAnomalyDetector() throws Exception { null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response response = TestHelpers .makeRequest(client(), "POST", TestHelpers.AD_BASE_DETECTORS_URI, ImmutableMap.of(), TestHelpers.toHttpEntity(detector), null); assertEquals("Create anomaly detector failed", RestStatus.CREATED, TestHelpers.restStatus(response)); @@ -205,7 +209,7 @@ public void testUpdateAnomalyDetectorCategoryField() throws Exception { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -213,7 +217,8 @@ public void testUpdateAnomalyDetectorCategoryField() throws Exception { detector.getLastUpdateTime(), ImmutableList.of(randomAlphaOfLength(5)), 
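// --- Illustrative aside, not part of the patch: the gate pattern this IT class
// now repeats for every endpoint, after EnabledSetting.AD_PLUGIN_ENABLED became
// ADEnabledSetting.AD_ENABLED and the message constant moved to ADCommonMessages.
// `url` stands in for whichever endpoint a given test exercises.
updateClusterSettings(ADEnabledSetting.AD_ENABLED, false);
Exception ex = expectThrows(
    ResponseException.class,
    () -> TestHelpers.makeRequest(client(), "POST", url, ImmutableMap.of(), "", null)
);
assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG));
updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); // re-enabled, the same call succeeds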
detector.getUser(), - null + null, + TestHelpers.randomImputationOption() ); Exception ex = expectThrows( ResponseException.class, @@ -227,33 +232,33 @@ public void testUpdateAnomalyDetectorCategoryField() throws Exception { null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.CAN_NOT_CHANGE_CATEGORY_FIELD)); + assertThat(ex.getMessage(), containsString(CommonMessages.CAN_NOT_CHANGE_CATEGORY_FIELD)); } public void testGetAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); - Exception ex = expectThrows(ResponseException.class, () -> getAnomalyDetector(detector.getDetectorId(), client())); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + Exception ex = expectThrows(ResponseException.class, () -> getConfig(detector.getId(), client())); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); - AnomalyDetector createdDetector = getAnomalyDetector(detector.getDetectorId(), client()); + AnomalyDetector createdDetector = getConfig(detector.getId(), client()); assertEquals("Incorrect Location header", detector, createdDetector); } public void testGetNotExistingAnomalyDetector() throws Exception { createRandomAnomalyDetector(true, true, client()); - TestHelpers.assertFailWith(ResponseException.class, null, () -> getAnomalyDetector(randomAlphaOfLength(5), client())); + TestHelpers.assertFailWith(ResponseException.class, null, () -> getConfig(randomAlphaOfLength(5), client())); } public void testUpdateAnomalyDetector() throws Exception { AnomalyDetector detector = createAnomalyDetector(createIndexAndGetAnomalyDetector(INDEX_NAME), true, client()); String newDescription = randomAlphaOfLength(5); AnomalyDetector newDetector = new AnomalyDetector( - detector.getDetectorId(), + detector.getId(), detector.getVersion(), detector.getName(), newDescription, @@ -261,7 +266,7 @@ public void testUpdateAnomalyDetector() throws Exception { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -269,10 +274,11 @@ public void testUpdateAnomalyDetector() throws Exception { detector.getLastUpdateTime(), null, detector.getUser(), - null + null, + TestHelpers.randomImputationOption() ); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -280,21 +286,21 @@ public void testUpdateAnomalyDetector() throws Exception { .makeRequest( client(), "PUT", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "?refresh=true", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "?refresh=true", ImmutableMap.of(), TestHelpers.toHttpEntity(newDetector), null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response updateResponse = TestHelpers .makeRequest( client(), 
"PUT", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "?refresh=true", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "?refresh=true", ImmutableMap.of(), TestHelpers.toHttpEntity(newDetector), null @@ -302,10 +308,10 @@ public void testUpdateAnomalyDetector() throws Exception { assertEquals("Update anomaly detector failed", RestStatus.OK, TestHelpers.restStatus(updateResponse)); Map responseBody = entityAsMap(updateResponse); - assertEquals("Updated anomaly detector id doesn't match", detector.getDetectorId(), responseBody.get("_id")); + assertEquals("Updated anomaly detector id doesn't match", detector.getId(), responseBody.get("_id")); assertEquals("Version not incremented", (detector.getVersion().intValue() + 1), (int) responseBody.get("_version")); - AnomalyDetector updatedDetector = getAnomalyDetector(detector.getDetectorId(), client()); + AnomalyDetector updatedDetector = getConfig(detector.getId(), client()); assertNotEquals("Anomaly detector last update time not changed", updatedDetector.getLastUpdateTime(), detector.getLastUpdateTime()); assertEquals("Anomaly detector description not updated", newDescription, updatedDetector.getDescription()); } @@ -314,7 +320,7 @@ public void testUpdateAnomalyDetectorNameToExisting() throws Exception { AnomalyDetector detector1 = createIndexAndGetAnomalyDetector("index-test-one"); AnomalyDetector detector2 = createIndexAndGetAnomalyDetector("index-test-two"); AnomalyDetector newDetector1WithDetector2Name = new AnomalyDetector( - detector1.getDetectorId(), + detector1.getId(), detector1.getVersion(), detector2.getName(), detector1.getDescription(), @@ -322,7 +328,7 @@ public void testUpdateAnomalyDetectorNameToExisting() throws Exception { detector1.getIndices(), detector1.getFeatureAttributes(), detector1.getFilterQuery(), - detector1.getDetectionInterval(), + detector1.getInterval(), detector1.getWindowDelay(), detector1.getShingleSize(), detector1.getUiMetadata(), @@ -330,7 +336,8 @@ public void testUpdateAnomalyDetectorNameToExisting() throws Exception { detector1.getLastUpdateTime(), null, detector1.getUser(), - null + null, + TestHelpers.randomImputationOption() ); TestHelpers @@ -352,7 +359,7 @@ public void testUpdateAnomalyDetectorNameToExisting() throws Exception { public void testUpdateAnomalyDetectorNameToNew() throws Exception { AnomalyDetector detector = createAnomalyDetector(createIndexAndGetAnomalyDetector(INDEX_NAME), true, client()); AnomalyDetector detectorWithNewName = new AnomalyDetector( - detector.getDetectorId(), + detector.getId(), detector.getVersion(), randomAlphaOfLength(5), detector.getDescription(), @@ -360,7 +367,7 @@ public void testUpdateAnomalyDetectorNameToNew() throws Exception { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -368,22 +375,23 @@ public void testUpdateAnomalyDetectorNameToNew() throws Exception { Instant.now(), null, detector.getUser(), - null + null, + TestHelpers.randomImputationOption() ); TestHelpers .makeRequest( client(), "PUT", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "?refresh=true", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "?refresh=true", ImmutableMap.of(), TestHelpers.toHttpEntity(detectorWithNewName), null ); - AnomalyDetector resultDetector = getAnomalyDetector(detectorWithNewName.getDetectorId(), client()); + 
AnomalyDetector resultDetector = getConfig(detectorWithNewName.getId(), client()); assertEquals("Detector name updating failed", detectorWithNewName.getName(), resultDetector.getName()); - assertEquals("Updated anomaly detector id doesn't match", detectorWithNewName.getDetectorId(), resultDetector.getDetectorId()); + assertEquals("Updated anomaly detector id doesn't match", detectorWithNewName.getId(), resultDetector.getId()); assertNotEquals( "Anomaly detector last update time not changed", detectorWithNewName.getLastUpdateTime(), @@ -397,7 +405,7 @@ public void testUpdateAnomalyDetectorWithNotExistingIndex() throws Exception { String newDescription = randomAlphaOfLength(5); AnomalyDetector newDetector = new AnomalyDetector( - detector.getDetectorId(), + detector.getId(), detector.getVersion(), detector.getName(), newDescription, @@ -405,7 +413,7 @@ public void testUpdateAnomalyDetectorWithNotExistingIndex() throws Exception { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -413,10 +421,11 @@ public void testUpdateAnomalyDetectorWithNotExistingIndex() throws Exception { detector.getLastUpdateTime(), null, detector.getUser(), - null + null, + TestHelpers.randomImputationOption() ); - deleteIndexWithAdminClient(AnomalyDetector.ANOMALY_DETECTORS_INDEX); + deleteIndexWithAdminClient(CommonName.CONFIG_INDEX); TestHelpers .assertFailWith( @@ -426,7 +435,7 @@ public void testUpdateAnomalyDetectorWithNotExistingIndex() throws Exception { .makeRequest( client(), "PUT", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), TestHelpers.toHttpEntity(newDetector), null @@ -436,9 +445,9 @@ public void testUpdateAnomalyDetectorWithNotExistingIndex() throws Exception { public void testSearchAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - SearchSourceBuilder search = (new SearchSourceBuilder()).query(QueryBuilders.termQuery("_id", detector.getDetectorId())); + SearchSourceBuilder search = (new SearchSourceBuilder()).query(QueryBuilders.termQuery("_id", detector.getId())); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -448,13 +457,13 @@ public void testSearchAnomalyDetector() throws Exception { "GET", TestHelpers.AD_BASE_DETECTORS_URI + "/_search", ImmutableMap.of(), - new NStringEntity(search.toString(), ContentType.APPLICATION_JSON), + new StringEntity(search.toString(), ContentType.APPLICATION_JSON), null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response searchResponse = TestHelpers .makeRequest( @@ -462,24 +471,24 @@ public void testSearchAnomalyDetector() throws Exception { "GET", TestHelpers.AD_BASE_DETECTORS_URI + "/_search", ImmutableMap.of(), - new NStringEntity(search.toString(), ContentType.APPLICATION_JSON), + new StringEntity(search.toString(), ContentType.APPLICATION_JSON), null ); assertEquals("Search anomaly detector failed", RestStatus.OK, 
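// --- Illustrative aside, not part of the patch: the HTTP-entity migration woven
// through this file. The Apache NIO entity is swapped for the plain blocking one
// with the same payload and content type; the search body itself is unchanged.
SearchSourceBuilder search = new SearchSourceBuilder().query(QueryBuilders.termQuery("_id", detector.getId()));
// before: new NStringEntity(search.toString(), ContentType.APPLICATION_JSON)
HttpEntity entity = new StringEntity(search.toString(), ContentType.APPLICATION_JSON);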
TestHelpers.restStatus(searchResponse)); } public void testStatsAnomalyDetector() throws Exception { - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, - () -> TestHelpers.makeRequest(client(), "GET", AnomalyDetectorPlugin.LEGACY_AD_BASE + "/stats", ImmutableMap.of(), "", null) + () -> TestHelpers.makeRequest(client(), "GET", TimeSeriesAnalyticsPlugin.LEGACY_AD_BASE + "/stats", ImmutableMap.of(), "", null) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response statsResponse = TestHelpers - .makeRequest(client(), "GET", AnomalyDetectorPlugin.LEGACY_AD_BASE + "/stats", ImmutableMap.of(), "", null); + .makeRequest(client(), "GET", TimeSeriesAnalyticsPlugin.LEGACY_AD_BASE + "/stats", ImmutableMap.of(), "", null); assertEquals("Get stats failed", RestStatus.OK, TestHelpers.restStatus(statsResponse)); } @@ -487,13 +496,13 @@ public void testStatsAnomalyDetector() throws Exception { public void testPreviewAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, false, client()); AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - detector.getDetectorId(), + detector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), null ); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -507,9 +516,9 @@ public void testPreviewAnomalyDetector() throws Exception { null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response response = TestHelpers .makeRequest( @@ -571,7 +580,7 @@ public void testExecuteAnomalyDetectorWithNullDetectorId() throws Exception { public void testPreviewAnomalyDetectorWithDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - detector.getDetectorId(), + detector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), detector @@ -592,7 +601,7 @@ public void testPreviewAnomalyDetectorWithDetector() throws Exception { public void testPreviewAnomalyDetectorWithDetectorAndNoFeatures() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - detector.getDetectorId(), + detector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), TestHelpers.randomAnomalyDetectorWithEmptyFeature() @@ -627,10 +636,9 @@ public void testSearchAnomalyResult() throws Exception { ); assertEquals("Post anomaly result failed", RestStatus.CREATED, TestHelpers.restStatus(response)); - SearchSourceBuilder search = (new SearchSourceBuilder()) - .query(QueryBuilders.termQuery("detector_id", anomalyResult.getDetectorId())); + SearchSourceBuilder search = (new SearchSourceBuilder()).query(QueryBuilders.termQuery("detector_id", 
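// --- Illustrative aside, not part of the patch: the query being assembled at this
// point in testSearchAnomalyResult. The indexed field is still literally
// "detector_id"; only the Java accessor on AnomalyResult changed, from
// getDetectorId() to getConfigId().
SearchSourceBuilder search = new SearchSourceBuilder()
    .query(QueryBuilders.termQuery("detector_id", anomalyResult.getConfigId()));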
anomalyResult.getConfigId())); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -640,13 +648,13 @@ public void testSearchAnomalyResult() throws Exception { "POST", TestHelpers.AD_BASE_RESULT_URI + "/_search", ImmutableMap.of(), - new NStringEntity(search.toString(), ContentType.APPLICATION_JSON), + new StringEntity(search.toString(), ContentType.APPLICATION_JSON), null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response searchResponse = TestHelpers .makeRequest( @@ -654,7 +662,7 @@ public void testSearchAnomalyResult() throws Exception { "POST", TestHelpers.AD_BASE_RESULT_URI + "/_search", ImmutableMap.of(), - new NStringEntity(search.toString(), ContentType.APPLICATION_JSON), + new StringEntity(search.toString(), ContentType.APPLICATION_JSON), null ); assertEquals("Search anomaly result failed", RestStatus.OK, TestHelpers.restStatus(searchResponse)); @@ -666,7 +674,7 @@ public void testSearchAnomalyResult() throws Exception { "POST", TestHelpers.AD_BASE_RESULT_URI + "/_search", ImmutableMap.of(), - new NStringEntity(searchAll.toString(), ContentType.APPLICATION_JSON), + new StringEntity(searchAll.toString(), ContentType.APPLICATION_JSON), null ); assertEquals("Search anomaly result failed", RestStatus.OK, TestHelpers.restStatus(searchAllResponse)); @@ -675,32 +683,18 @@ public void testSearchAnomalyResult() throws Exception { public void testDeleteAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, false, client()); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, () -> TestHelpers - .makeRequest( - client(), - "DELETE", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), - ImmutableMap.of(), - "", - null - ) + .makeRequest(client(), "DELETE", TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), "", null) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response response = TestHelpers - .makeRequest( - client(), - "DELETE", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), - ImmutableMap.of(), - "", - null - ); + .makeRequest(client(), "DELETE", TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), "", null); assertEquals("Delete anomaly detector failed", RestStatus.OK, TestHelpers.restStatus(response)); } @@ -723,14 +717,7 @@ public void testDeleteAnomalyDetectorWhichNotExist() throws Exception { public void testDeleteAnomalyDetectorWithNoAdJob() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, false, client()); Response response = TestHelpers - .makeRequest( - client(), - "DELETE", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), - ImmutableMap.of(), - "", - null - ); + .makeRequest(client(), "DELETE", TestHelpers.AD_BASE_DETECTORS_URI + "/" + 
detector.getId(), ImmutableMap.of(), "", null); assertEquals("Delete anomaly detector failed", RestStatus.OK, TestHelpers.restStatus(response)); } @@ -740,7 +727,7 @@ public void testDeleteAnomalyDetectorWithRunningAdJob() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -756,7 +743,7 @@ public void testDeleteAnomalyDetectorWithRunningAdJob() throws Exception { .makeRequest( client(), "DELETE", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), "", null @@ -770,7 +757,7 @@ public void testUpdateAnomalyDetectorWithRunningAdJob() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -781,7 +768,7 @@ public void testUpdateAnomalyDetectorWithRunningAdJob() throws Exception { String newDescription = randomAlphaOfLength(5); AnomalyDetector newDetector = new AnomalyDetector( - detector.getDetectorId(), + detector.getId(), detector.getVersion(), detector.getName(), newDescription, @@ -789,7 +776,7 @@ public void testUpdateAnomalyDetectorWithRunningAdJob() throws Exception { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), @@ -797,7 +784,8 @@ public void testUpdateAnomalyDetectorWithRunningAdJob() throws Exception { detector.getLastUpdateTime(), null, detector.getUser(), - null + null, + TestHelpers.randomImputationOption() ); TestHelpers @@ -808,7 +796,7 @@ public void testUpdateAnomalyDetectorWithRunningAdJob() throws Exception { .makeRequest( client(), "PUT", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId(), + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), TestHelpers.toHttpEntity(newDetector), null @@ -822,7 +810,7 @@ public void testGetDetectorWithAdJob() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -830,12 +818,12 @@ public void testGetDetectorWithAdJob() throws Exception { assertEquals("Fail to start AD job", RestStatus.OK, TestHelpers.restStatus(startAdJobResponse)); - ToXContentObject[] results = getAnomalyDetector(detector.getDetectorId(), true, client()); + ToXContentObject[] results = getConfig(detector.getId(), true, client()); assertEquals("Incorrect Location header", detector, results[0]); - assertEquals("Incorrect detector job name", detector.getDetectorId(), ((AnomalyDetectorJob) results[1]).getName()); - assertTrue(((AnomalyDetectorJob) results[1]).isEnabled()); + assertEquals("Incorrect detector job name", detector.getId(), ((Job) results[1]).getName()); + assertTrue(((Job) results[1]).isEnabled()); - results = getAnomalyDetector(detector.getDetectorId(), false, client()); + results = getConfig(detector.getId(), false, client()); assertEquals("Incorrect Location header", detector, results[0]); assertEquals("Should not return detector job", null, results[1]); } @@ -843,7 +831,7 @@ public void 
testGetDetectorWithAdJob() throws Exception { public void testStartAdJobWithExistingDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, false, client()); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -851,20 +839,20 @@ public void testStartAdJobWithExistingDetector() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response startAdJobResponse = TestHelpers .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -876,7 +864,7 @@ public void testStartAdJobWithExistingDetector() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -907,7 +895,7 @@ public void testStartAdJobWithNonexistingDetector() throws Exception { TestHelpers .assertFailWith( ResponseException.class, - FAIL_TO_FIND_DETECTOR_MSG, + FAIL_TO_FIND_CONFIG_MSG, () -> TestHelpers .makeRequest( client(), @@ -921,20 +909,20 @@ public void testStartAdJobWithNonexistingDetector() throws Exception { } public void testStopAdJob() throws Exception { - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); AnomalyDetector detector = createRandomAnomalyDetector(true, false, client()); Response startAdJobResponse = TestHelpers .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null ); assertEquals("Fail to start AD job", RestStatus.OK, TestHelpers.restStatus(startAdJobResponse)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); Exception ex = expectThrows( ResponseException.class, @@ -942,21 +930,21 @@ public void testStopAdJob() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_stop", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_stop", ImmutableMap.of(), "", null ) ); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); Response stopAdJobResponse = TestHelpers .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_stop", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_stop", ImmutableMap.of(), "", null @@ -967,7 +955,7 @@ public void testStopAdJob() 
throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_stop", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_stop", ImmutableMap.of(), "", null @@ -985,7 +973,7 @@ public void testStopNonExistingAdJobIndex() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_stop", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_stop", ImmutableMap.of(), "", null @@ -999,7 +987,7 @@ public void testStopNonExistingAdJob() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -1009,7 +997,7 @@ public void testStopNonExistingAdJob() throws Exception { TestHelpers .assertFailWith( ResponseException.class, - FAIL_TO_FIND_DETECTOR_MSG, + FAIL_TO_FIND_CONFIG_MSG, () -> TestHelpers .makeRequest( client(), @@ -1028,7 +1016,7 @@ public void testStartDisabledAdjob() throws IOException { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -1039,7 +1027,7 @@ public void testStartDisabledAdjob() throws IOException { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_stop", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_stop", ImmutableMap.of(), "", null @@ -1050,7 +1038,7 @@ public void testStartDisabledAdjob() throws IOException { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -1072,7 +1060,7 @@ public void testStartAdjobWithNullFeatures() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -1093,7 +1081,7 @@ public void testStartAdjobWithEmptyFeatures() throws Exception { .makeRequest( client(), "POST", - TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getDetectorId() + "/_start", + TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId() + "/_start", ImmutableMap.of(), "", null @@ -1104,26 +1092,26 @@ public void testStartAdjobWithEmptyFeatures() throws Exception { public void testDefaultProfileAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, false); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, false); - Exception ex = expectThrows(ResponseException.class, () -> getDetectorProfile(detector.getDetectorId())); - assertThat(ex.getMessage(), containsString(CommonErrorMessages.DISABLED_ERR_MSG)); + Exception ex = expectThrows(ResponseException.class, () -> getDetectorProfile(detector.getId())); + assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG)); - updateClusterSettings(EnabledSetting.AD_PLUGIN_ENABLED, true); + updateClusterSettings(ADEnabledSetting.AD_ENABLED, true); - Response profileResponse = getDetectorProfile(detector.getDetectorId()); + Response profileResponse = getDetectorProfile(detector.getId()); 
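Every REST-layer test in this block exercises the same toggle: the renamed ADEnabledSetting.AD_ENABLED cluster setting (formerly EnabledSetting.AD_PLUGIN_ENABLED) is switched off, the request is expected to fail with ADCommonMessages.DISABLED_ERR_MSG (formerly CommonErrorMessages), and the setting is switched back on before asserting success. A minimal sketch of that pattern, built only from helpers visible in the hunks above (illustration, not part of the patch):

    // Disable the AD plugin; any AD REST call should now be rejected.
    updateClusterSettings(ADEnabledSetting.AD_ENABLED, false);
    Exception ex = expectThrows(
        ResponseException.class,
        () -> TestHelpers
            .makeRequest(client(), "DELETE", TestHelpers.AD_BASE_DETECTORS_URI + "/" + detector.getId(), ImmutableMap.of(), "", null)
    );
    assertThat(ex.getMessage(), containsString(ADCommonMessages.DISABLED_ERR_MSG));
    // Re-enable; the same request is then expected to return RestStatus.OK.
    updateClusterSettings(ADEnabledSetting.AD_ENABLED, true);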
assertEquals("Incorrect profile status", RestStatus.OK, TestHelpers.restStatus(profileResponse)); } public void testAllProfileAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - Response profileResponse = getDetectorProfile(detector.getDetectorId(), true); + Response profileResponse = getDetectorProfile(detector.getId(), true); assertEquals("Incorrect profile status", RestStatus.OK, TestHelpers.restStatus(profileResponse)); } public void testCustomizedProfileAnomalyDetector() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - Response profileResponse = getDetectorProfile(detector.getDetectorId(), true, "/models/", client()); + Response profileResponse = getDetectorProfile(detector.getId(), true, "/models/", client()); assertEquals("Incorrect profile status", RestStatus.OK, TestHelpers.restStatus(profileResponse)); } @@ -1167,28 +1155,24 @@ public void testSearchAnomalyDetectorMatch() throws Exception { public void testRunDetectorWithNoEnabledFeature() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client(), false); - Assert.assertNotNull(detector.getDetectorId()); + Assert.assertNotNull(detector.getId()); Instant now = Instant.now(); ResponseException e = expectThrows( ResponseException.class, - () -> startAnomalyDetector(detector.getDetectorId(), new DetectionDateRange(now.minus(10, ChronoUnit.DAYS), now), client()) + () -> startAnomalyDetector(detector.getId(), new DateRange(now.minus(10, ChronoUnit.DAYS), now), client()) ); assertTrue(e.getMessage().contains("Can't start detector job as no enabled features configured")); } public void testDeleteAnomalyDetectorWhileRunning() throws Exception { AnomalyDetector detector = createRandomAnomalyDetector(true, true, client()); - Assert.assertNotNull(detector.getDetectorId()); + Assert.assertNotNull(detector.getId()); Instant now = Instant.now(); - Response response = startAnomalyDetector( - detector.getDetectorId(), - new DetectionDateRange(now.minus(10, ChronoUnit.DAYS), now), - client() - ); - Assert.assertEquals(response.getStatusLine().toString(), "HTTP/1.1 200 OK"); + Response response = startAnomalyDetector(detector.getId(), new DateRange(now.minus(10, ChronoUnit.DAYS), now), client()); + org.hamcrest.MatcherAssert.assertThat(response.getStatusLine().toString(), CoreMatchers.containsString("200 OK")); // Deleting detector should fail while its running - Exception exception = expectThrows(IOException.class, () -> { deleteAnomalyDetector(detector.getDetectorId(), client()); }); + Exception exception = expectThrows(IOException.class, () -> { deleteAnomalyDetector(detector.getId(), client()); }); Assert.assertTrue(exception.getMessage().contains("Detector is running")); } @@ -1213,15 +1197,15 @@ public void testBackwardCompatibilityWithOpenDistro() throws IOException { assertTrue("incorrect version", version > 0); // Get the detector using new _plugins API - AnomalyDetector createdDetector = getAnomalyDetector(id, client()); - assertEquals("Get anomaly detector failed", createdDetector.getDetectorId(), id); + AnomalyDetector createdDetector = getConfig(id, client()); + assertEquals("Get anomaly detector failed", createdDetector.getId(), id); // Delete the detector using legacy _opendistro API response = TestHelpers .makeRequest( client(), "DELETE", - TestHelpers.LEGACY_OPENDISTRO_AD_BASE_DETECTORS_URI + "/" + createdDetector.getDetectorId(), + 
TestHelpers.LEGACY_OPENDISTRO_AD_BASE_DETECTORS_URI + "/" + createdDetector.getId(), ImmutableMap.of(), "", null @@ -1262,7 +1246,7 @@ public void testValidateAnomalyDetectorWithDuplicateName() throws Exception { Map<String, Map<String, String>> messageMap = (Map<String, Map<String, String>>) XContentMapValues .extractValue("detector", responseMap); assertEquals("Validation returned duplicate detector name message", RestStatus.OK, TestHelpers.restStatus(resp)); - String errorMsg = String.format(Locale.ROOT, DUPLICATE_DETECTOR_MSG, detector.getName(), "[" + detector.getDetectorId() + "]"); + String errorMsg = String.format(Locale.ROOT, DUPLICATE_DETECTOR_MSG, detector.getName(), "[" + detector.getId() + "]"); assertEquals("duplicate error message", errorMsg, messageMap.get("name").get("message")); } @@ -1289,7 +1273,7 @@ public void testValidateAnomalyDetectorWithNoTimeField() throws Exception { Map<String, Map<String, String>> messageMap = (Map<String, Map<String, String>>) XContentMapValues .extractValue("detector", responseMap); assertEquals("Validation response returned", RestStatus.OK, TestHelpers.restStatus(resp)); - assertEquals("time field missing", CommonErrorMessages.NULL_TIME_FIELD, messageMap.get("time_field").get("message")); + assertEquals("time field missing", CommonMessages.NULL_TIME_FIELD, messageMap.get("time_field").get("message")); } public void testValidateAnomalyDetectorWithIncorrectShingleSize() throws Exception { @@ -1323,7 +1307,7 @@ public void testValidateAnomalyDetectorWithIncorrectShingleSize() throws Excepti Map<String, Map<String, String>> messageMap = (Map<String, Map<String, String>>) XContentMapValues .extractValue("detector", responseMap); String errorMessage = "Shingle size must be a positive integer no larger than " - + AnomalyDetectorSettings.MAX_SHINGLE_SIZE + + TimeSeriesSettings.MAX_SHINGLE_SIZE + ". Got 2000"; assertEquals("shingle size error message", errorMessage, messageMap.get("shingle_size").get("message")); } @@ -1348,7 +1332,7 @@ public void testValidateAnomalyDetectorOnWrongValidationType() throws Exception TestHelpers .assertFailWith( ResponseException.class, - CommonErrorMessages.NOT_EXISTENT_VALIDATION_TYPE, + ADCommonMessages.NOT_EXISTENT_VALIDATION_TYPE, () -> TestHelpers .makeRequest( client(), @@ -1418,7 +1402,7 @@ public void testValidateAnomalyDetectorWithInvalidName() throws Exception { @SuppressWarnings("unchecked") Map<String, Map<String, String>> messageMap = (Map<String, Map<String, String>>) XContentMapValues .extractValue("detector", responseMap); - assertEquals("invalid detector Name", CommonErrorMessages.INVALID_DETECTOR_NAME, messageMap.get("name").get("message")); + assertEquals("invalid detector Name", CommonMessages.INVALID_NAME, messageMap.get("name").get("message")); } public void testValidateAnomalyDetectorWithFeatureQueryReturningNoData() throws Exception { @@ -1439,7 +1423,7 @@ public void testValidateAnomalyDetectorWithFeatureQueryReturningNoData() throws .extractValue("detector", responseMap); assertEquals( "empty data", - CommonErrorMessages.FEATURE_WITH_EMPTY_DATA_MSG + "f-empty", + CommonMessages.FEATURE_WITH_EMPTY_DATA_MSG + "f-empty", messageMap.get("feature_attributes").get("message") ); } @@ -1462,7 +1446,7 @@ public void testValidateAnomalyDetectorWithFeatureQueryRuntimeException() throws .extractValue("detector", responseMap); assertEquals( "runtime exception", - CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG + "non-numeric-feature", + CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG + "non-numeric-feature", messageMap.get("feature_attributes").get("message") ); } @@ -1522,32 +1506,32 @@ public void testSearchTopAnomalyResultsWithInvalidInputs() throws IOException { // Missing start time Exception missingStartTimeException 
= expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), false, "{\"end_time_ms\":2}", client()); + searchTopAnomalyResults(detector.getId(), false, "{\"end_time_ms\":2}", client()); }); assertTrue(missingStartTimeException.getMessage().contains("Must set both start time and end time with epoch of milliseconds")); // Missing end time Exception missingEndTimeException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), false, "{\"start_time_ms\":1}", client()); + searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":1}", client()); }); assertTrue(missingEndTimeException.getMessage().contains("Must set both start time and end time with epoch of milliseconds")); // Start time > end time Exception invalidTimeException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), false, "{\"start_time_ms\":2, \"end_time_ms\":1}", client()); + searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":2, \"end_time_ms\":1}", client()); }); assertTrue(invalidTimeException.getMessage().contains("Start time should be before end time")); // Invalid detector ID Exception invalidDetectorIdException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId() + "-invalid", false, "{\"start_time_ms\":1, \"end_time_ms\":2}", client()); + searchTopAnomalyResults(detector.getId() + "-invalid", false, "{\"start_time_ms\":1, \"end_time_ms\":2}", client()); }); - assertTrue(invalidDetectorIdException.getMessage().contains("Can't find detector with id")); + assertTrue(invalidDetectorIdException.getMessage().contains("Can't find config with id")); // Invalid order field Exception invalidOrderException = expectThrows(IOException.class, () -> { searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"order\":\"invalid-order\"}", client() @@ -1557,37 +1541,32 @@ public void testSearchTopAnomalyResultsWithInvalidInputs() throws IOException { // Negative size field Exception negativeSizeException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":-1}", client()); + searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":-1}", client()); }); assertTrue(negativeSizeException.getMessage().contains("Size must be a positive integer")); // Zero size field Exception zeroSizeException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":0}", client()); + searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":0}", client()); }); assertTrue(zeroSizeException.getMessage().contains("Size must be a positive integer")); // Too large size field Exception tooLargeSizeException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults( - detector.getDetectorId(), - false, - "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":9999999}", - client() - ); + searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"size\":9999999}", client()); }); assertTrue(tooLargeSizeException.getMessage().contains("Size cannot exceed")); // No existing task ID for detector Exception noTaskIdException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults(detector.getDetectorId(), 
true, "{\"start_time_ms\":1, \"end_time_ms\":2}", client()); + searchTopAnomalyResults(detector.getId(), true, "{\"start_time_ms\":1, \"end_time_ms\":2}", client()); }); - assertTrue(noTaskIdException.getMessage().contains("No historical tasks found for detector ID " + detector.getDetectorId())); + assertTrue(noTaskIdException.getMessage().contains("No historical tasks found for detector ID " + detector.getId())); // Invalid category fields Exception invalidCategoryFieldsException = expectThrows(IOException.class, () -> { searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2, \"category_field\":[\"invalid-field\"]}", client() @@ -1596,7 +1575,7 @@ public void testSearchTopAnomalyResultsWithInvalidInputs() throws IOException { assertTrue( invalidCategoryFieldsException .getMessage() - .contains("Category field invalid-field doesn't exist for detector ID " + detector.getDetectorId()) + .contains("Category field invalid-field doesn't exist for detector ID " + detector.getId()) ); // Using detector with no category fields @@ -1612,17 +1591,12 @@ public void testSearchTopAnomalyResultsWithInvalidInputs() throws IOException { client() ); Exception noCategoryFieldsException = expectThrows(IOException.class, () -> { - searchTopAnomalyResults( - detectorWithNoCategoryFields.getDetectorId(), - false, - "{\"start_time_ms\":1, \"end_time_ms\":2}", - client() - ); + searchTopAnomalyResults(detectorWithNoCategoryFields.getId(), false, "{\"start_time_ms\":1, \"end_time_ms\":2}", client()); }); assertTrue( noCategoryFieldsException .getMessage() - .contains("No category fields found for detector ID " + detectorWithNoCategoryFields.getDetectorId()) + .contains("No category fields found for detector ID " + detectorWithNoCategoryFields.getId()) ); } @@ -1650,12 +1624,11 @@ public void testSearchTopAnomalyResultsOnNonExistentResultIndex() throws IOExcep ); // Delete any existing result index - if (indexExistsWithAdminClient(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { - // need to provide concrete indices to delete. Otherwise, will get exceptions from OpenSearch core. - deleteIndexWithAdminClient(CommonName.ANOMALY_RESULT_INDEX_ALL); + if (indexExistsWithAdminClient(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + deleteIndexWithAdminClient(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS + "*"); } Response response = searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"size\":3,\"category_field\":[\"keyword-field\"]," + "\"start_time_ms\":0, \"end_time_ms\":1}", client() @@ -1689,14 +1662,12 @@ public void testSearchTopAnomalyResultsOnEmptyResultIndex() throws IOException { client() ); - // Clear any existing result index, create an empty one - if (indexExistsWithAdminClient(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { - // need to provide concrete indices to delete. Otherwise, will get exceptions from OpenSearch core. 
- deleteIndexWithAdminClient(CommonName.ANOMALY_RESULT_INDEX_ALL); + if (indexExistsWithAdminClient(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + deleteIndexWithAdminClient(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS + "*"); } TestHelpers.createEmptyAnomalyResultIndex(adminClient()); Response response = searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"size\":3,\"category_field\":[\"keyword-field\"]," + "\"start_time_ms\":0, \"end_time_ms\":1}", client() @@ -1731,7 +1702,7 @@ public void testSearchTopAnomalyResultsOnPopulatedResultIndex() throws IOExcepti ); // Ingest some sample results - if (!indexExistsWithAdminClient(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + if (!indexExistsWithAdminClient(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { TestHelpers.createEmptyAnomalyResultIndex(adminClient()); } Map<String, Object> entityAttrs1 = new HashMap<String, Object>() { @@ -1753,19 +1724,19 @@ public void testSearchTopAnomalyResultsOnPopulatedResultIndex() throws IOExcepti } }; AnomalyResult anomalyResult1 = TestHelpers - .randomHCADAnomalyDetectResult(detector.getDetectorId(), null, entityAttrs1, 0.5, 0.8, null, 5L, 5L); + .randomHCADAnomalyDetectResult(detector.getId(), null, entityAttrs1, 0.5, 0.8, null, 5L, 5L); AnomalyResult anomalyResult2 = TestHelpers - .randomHCADAnomalyDetectResult(detector.getDetectorId(), null, entityAttrs2, 0.5, 0.5, null, 5L, 5L); + .randomHCADAnomalyDetectResult(detector.getId(), null, entityAttrs2, 0.5, 0.5, null, 5L, 5L); AnomalyResult anomalyResult3 = TestHelpers - .randomHCADAnomalyDetectResult(detector.getDetectorId(), null, entityAttrs3, 0.5, 0.2, null, 5L, 5L); + .randomHCADAnomalyDetectResult(detector.getId(), null, entityAttrs3, 0.5, 0.2, null, 5L, 5L); - TestHelpers.ingestDataToIndex(adminClient(), CommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult1)); - TestHelpers.ingestDataToIndex(adminClient(), CommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult2)); - TestHelpers.ingestDataToIndex(adminClient(), CommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult3)); + TestHelpers.ingestDataToIndex(adminClient(), ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult1)); + TestHelpers.ingestDataToIndex(adminClient(), ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult2)); + TestHelpers.ingestDataToIndex(adminClient(), ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, TestHelpers.toHttpEntity(anomalyResult3)); // Sorting by severity Response severityResponse = searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"category_field\":[\"keyword-field\"]," + "\"start_time_ms\":0, \"end_time_ms\":10, \"order\":\"severity\"}", client() @@ -1784,7 +1755,7 @@ public void testSearchTopAnomalyResultsOnPopulatedResultIndex() throws IOExcepti // Sorting by occurrence Response occurrenceResponse = searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"category_field\":[\"keyword-field\"]," + "\"start_time_ms\":0, \"end_time_ms\":10, \"order\":\"occurrence\"}", client() @@ -1803,7 +1774,7 @@ public void testSearchTopAnomalyResultsOnPopulatedResultIndex() throws IOExcepti // Sorting using all category fields Response allFieldsResponse = searchTopAnomalyResults( - detector.getDetectorId(), + detector.getId(), false, "{\"category_field\":[\"keyword-field\", \"ip-field\"]," + "\"start_time_ms\":0, \"end_time_ms\":10, \"order\":\"severity\"}", client() @@ -1825,7 +1796,7 @@ public void 
testSearchTopAnomalyResultsOnPopulatedResultIndex() throws IOExcepti public void testSearchTopAnomalyResultsWithCustomResultIndex() throws IOException { String indexName = randomAlphaOfLength(10).toLowerCase(Locale.ROOT); - String customResultIndexName = CommonName.CUSTOM_RESULT_INDEX_PREFIX + randomAlphaOfLength(5).toLowerCase(Locale.ROOT); + String customResultIndexName = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + randomAlphaOfLength(5).toLowerCase(Locale.ROOT); Map<String, String> categoryFieldsAndTypes = new HashMap<String, String>() { { put("keyword-field", "keyword"); @@ -1855,10 +1826,10 @@ public void testSearchTopAnomalyResultsWithCustomResultIndex() throws IOExceptio } }; AnomalyResult anomalyResult = TestHelpers - .randomHCADAnomalyDetectResult(detector.getDetectorId(), null, entityAttrs, 0.5, 0.8, null, 5L, 5L); + .randomHCADAnomalyDetectResult(detector.getId(), null, entityAttrs, 0.5, 0.8, null, 5L, 5L); TestHelpers.ingestDataToIndex(client(), customResultIndexName, TestHelpers.toHttpEntity(anomalyResult)); - Response response = searchTopAnomalyResults(detector.getDetectorId(), false, "{\"start_time_ms\":0, \"end_time_ms\":10}", client()); + Response response = searchTopAnomalyResults(detector.getId(), false, "{\"start_time_ms\":0, \"end_time_ms\":10}", client()); Map<String, Object> responseMap = entityAsMap(response); @SuppressWarnings("unchecked") List<Map<String, Object>> buckets = (ArrayList<Map<String, Object>>) XContentMapValues.extractValue("buckets", responseMap); diff --git a/src/test/java/org/opensearch/ad/rest/HistoricalAnalysisRestApiIT.java b/src/test/java/org/opensearch/ad/rest/HistoricalAnalysisRestApiIT.java index 4a8c77a23..3b7af33c2 100644 --- a/src/test/java/org/opensearch/ad/rest/HistoricalAnalysisRestApiIT.java +++ b/src/test/java/org/opensearch/ad/rest/HistoricalAnalysisRestApiIT.java @@ -11,14 +11,14 @@ package org.opensearch.ad.rest; -import static org.opensearch.ad.TestHelpers.AD_BASE_STATS_URI; -import static org.opensearch.ad.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS; -import static org.opensearch.ad.stats.StatNames.AD_TOTAL_BATCH_TASK_EXECUTION_COUNT; -import static org.opensearch.ad.stats.StatNames.MULTI_ENTITY_DETECTOR_COUNT; -import static org.opensearch.ad.stats.StatNames.SINGLE_ENTITY_DETECTOR_COUNT; +import static org.opensearch.timeseries.TestHelpers.AD_BASE_STATS_URI; +import static org.opensearch.timeseries.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; +import static org.opensearch.timeseries.stats.StatNames.AD_TOTAL_BATCH_TASK_EXECUTION_COUNT; +import static org.opensearch.timeseries.stats.StatNames.MULTI_ENTITY_DETECTOR_COUNT; +import static org.opensearch.timeseries.stats.StatNames.SINGLE_ENTITY_DETECTOR_COUNT; import java.io.IOException; import java.util.List; @@ -30,17 +30,17 @@ import org.junit.Before; import org.junit.Ignore; import org.opensearch.ad.HistoricalAnalysisRestTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskProfile; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.client.Response; import 
org.opensearch.client.ResponseException; import org.opensearch.core.rest.RestStatus; import org.opensearch.core.xcontent.ToXContentObject; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -66,7 +66,7 @@ public void testHistoricalAnalysisForSingleEntityDetector() throws Exception { } public void testHistoricalAnalysisForSingleEntityDetectorWithCustomResultIndex() throws Exception { - String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + randomAlphaOfLength(5).toLowerCase(Locale.ROOT); + String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + randomAlphaOfLength(5).toLowerCase(Locale.ROOT); List<String> startHistoricalAnalysisResult = startHistoricalAnalysis(0, resultIndex); String detectorId = startHistoricalAnalysisResult.get(0); String taskId = startHistoricalAnalysisResult.get(1); @@ -109,7 +109,7 @@ private List<String> startHistoricalAnalysis(int categoryFieldSize) throws Excep @SuppressWarnings("unchecked") private List<String> startHistoricalAnalysis(int categoryFieldSize, String resultIndex) throws Exception { AnomalyDetector detector = createAnomalyDetector(categoryFieldSize, resultIndex); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // start historical detector String taskId = startHistoricalAnalysis(detectorId); @@ -117,8 +117,8 @@ private List<String> startHistoricalAnalysis(int categoryFieldSize, String resul // get task profile ADTaskProfile adTaskProfile = waitUntilGetTaskProfile(detectorId); if (categoryFieldSize > 0) { - if (!ADTaskState.RUNNING.name().equals(adTaskProfile.getAdTask().getState())) { - adTaskProfile = (ADTaskProfile) waitUntilTaskReachState(detectorId, ImmutableSet.of(ADTaskState.RUNNING.name())).get(0); + if (!TaskState.RUNNING.name().equals(adTaskProfile.getAdTask().getState())) { + adTaskProfile = (ADTaskProfile) waitUntilTaskReachState(detectorId, ImmutableSet.of(TaskState.RUNNING.name())).get(0); } assertEquals((int) Math.pow(categoryFieldDocCount, categoryFieldSize), adTaskProfile.getTotalEntitiesCount().intValue()); assertTrue(adTaskProfile.getPendingEntitiesCount() > 0); @@ -145,7 +145,7 @@ private List<String> startHistoricalAnalysis(int categoryFieldSize, String resul // get detector with AD task ToXContentObject[] result = getHistoricalAnomalyDetector(detectorId, true, client()); AnomalyDetector parsedDetector = (AnomalyDetector) result[0]; - AnomalyDetectorJob parsedJob = (AnomalyDetectorJob) result[1]; + Job parsedJob = (Job) result[1]; ADTask parsedADTask = (ADTask) result[2]; assertNull(parsedJob); assertNotNull(parsedDetector); @@ -159,7 +159,7 @@ private List<String> startHistoricalAnalysis(int categoryFieldSize, String resul public void testStopHistoricalAnalysis() throws Exception { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // start historical detector String taskId = startHistoricalAnalysis(detectorId); @@ -171,7 +171,7 @@ public void testStopHistoricalAnalysis() throws Exception { assertEquals(RestStatus.OK, TestHelpers.restStatus(stopDetectorResponse)); // get task profile - checkIfTaskCanFinishCorrectly(detectorId, taskId, ImmutableSet.of(ADTaskState.STOPPED.name())); + checkIfTaskCanFinishCorrectly(detectorId, taskId, ImmutableSet.of(TaskState.STOPPED.name())); 
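The historical-analysis tests above poll task state through the renamed TaskState enum (formerly ADTaskState) before asserting on entity counts or stop behavior. A minimal sketch of that polling step, assuming the waitUntil* helpers shown in this file's hunks:

    // Wait until the historical task reaches RUNNING before asserting entity counts.
    ADTaskProfile adTaskProfile = waitUntilGetTaskProfile(detectorId);
    if (!TaskState.RUNNING.name().equals(adTaskProfile.getAdTask().getState())) {
        adTaskProfile = (ADTaskProfile) waitUntilTaskReachState(detectorId, ImmutableSet.of(TaskState.RUNNING.name())).get(0);
    }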
updateClusterSettings(BATCH_TASK_PIECE_INTERVAL_SECONDS.getKey(), 1); waitUntilTaskDone(detectorId); @@ -193,7 +193,7 @@ public void testStopHistoricalAnalysis() throws Exception { public void testUpdateHistoricalAnalysis() throws IOException, IllegalAccessException { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // update historical detector AnomalyDetector newDetector = randomAnomalyDetector(detector); @@ -207,11 +207,11 @@ public void testUpdateHistoricalAnalysis() throws IOException, IllegalAccessExce null ); Map<String, Object> responseBody = entityAsMap(updateResponse); - assertEquals(detector.getDetectorId(), responseBody.get("_id")); + assertEquals(detector.getId(), responseBody.get("_id")); assertEquals((detector.getVersion().intValue() + 1), (int) responseBody.get("_version")); // get historical detector - AnomalyDetector updatedDetector = getAnomalyDetector(detector.getDetectorId(), client()); + AnomalyDetector updatedDetector = getConfig(detector.getId(), client()); assertNotEquals(updatedDetector.getLastUpdateTime(), detector.getLastUpdateTime()); assertEquals(newDetector.getName(), updatedDetector.getName()); assertEquals(newDetector.getDescription(), updatedDetector.getDescription()); @@ -220,7 +220,7 @@ public void testUpdateRunningHistoricalAnalysis() throws Exception { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // start historical detector startHistoricalAnalysis(detectorId); @@ -249,7 +249,7 @@ public void testUpdateRunningHistoricalAnalysis() throws Exception { public void testDeleteHistoricalAnalysis() throws IOException, IllegalAccessException { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // delete detector Response response = TestHelpers @@ -262,7 +262,7 @@ public void testDeleteHistoricalAnalysis() throws IOException, IllegalAccessExce public void testDeleteRunningHistoricalDetector() throws Exception { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // start historical detector startHistoricalAnalysis(detectorId); @@ -282,7 +282,7 @@ public void testDeleteRunningHistoricalDetector() throws Exception { public void testSearchTasks() throws IOException, InterruptedException, IllegalAccessException { // create historical detector AnomalyDetector detector = createAnomalyDetector(); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); // start historical detector String taskId = startHistoricalAnalysis(detectorId); @@ -294,12 +294,12 @@ public void testSearchTasks() throws IOException, InterruptedException, IllegalA .makeRequest(client(), "POST", TestHelpers.AD_BASE_DETECTORS_URI + "/tasks/_search", ImmutableMap.of(), query, null); String searchResult = EntityUtils.toString(response.getEntity()); assertTrue(searchResult.contains(taskId)); - assertTrue(searchResult.contains(detector.getDetectorId())); + assertTrue(searchResult.contains(detector.getId())); } private AnomalyDetector randomAnomalyDetector(AnomalyDetector detector) { return new AnomalyDetector( - detector.getDetectorId(), 
+ detector.getId(), null, randomAlphaOfLength(5), randomAlphaOfLength(5), @@ -307,15 +307,16 @@ private AnomalyDetector randomAnomalyDetector(AnomalyDetector detector) { detector.getIndices(), detector.getFeatureAttributes(), detector.getFilterQuery(), - detector.getDetectionInterval(), + detector.getInterval(), detector.getWindowDelay(), detector.getShingleSize(), detector.getUiMetadata(), detector.getSchemaVersion(), detector.getLastUpdateTime(), - detector.getCategoryField(), + detector.getCategoryFields(), detector.getUser(), - detector.getResultIndex() + detector.getCustomResultIndex(), + detector.getImputationOption() ); } diff --git a/src/test/java/org/opensearch/ad/rest/SecureADRestIT.java b/src/test/java/org/opensearch/ad/rest/SecureADRestIT.java index 850142726..9017ec898 100644 --- a/src/test/java/org/opensearch/ad/rest/SecureADRestIT.java +++ b/src/test/java/org/opensearch/ad/rest/SecureADRestIT.java @@ -22,20 +22,22 @@ import org.apache.http.HttpHeaders; import org.apache.http.HttpHost; import org.apache.http.message.BasicHeader; +import org.hamcrest.CoreMatchers; +import org.hamcrest.MatcherAssert; import org.junit.After; import org.junit.Assert; import org.junit.Before; import org.opensearch.ad.AnomalyDetectorRestTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyDetectorExecutionInput; -import org.opensearch.ad.model.DetectionDateRange; import org.opensearch.client.Response; import org.opensearch.client.RestClient; import org.opensearch.commons.authuser.User; import org.opensearch.commons.rest.SecureRestClientBuilder; import org.opensearch.core.rest.RestStatus; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.DateRange; import com.google.common.collect.ImmutableList; @@ -63,14 +65,18 @@ public class SecureADRestIT extends AnomalyDetectorRestTestCase { * Create an unguessable password. Simple password are weak due to https://tinyurl.com/383em9zk * @return a random password. 
*/ - public static String generatePassword() { - String characters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789"; + public static String generatePassword(String username) { + String characters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789_"; Random rng = new Random(); - char[] password = new char[10]; - for (int i = 0; i < 10; i++) { - password[i] = characters.charAt(rng.nextInt(characters.length())); + char[] password = new char[15]; + for (int i = 0; i < 15; i++) { + char nextChar = characters.charAt(rng.nextInt(characters.length())); + while (username.indexOf(nextChar) > -1) { + nextChar = characters.charAt(rng.nextInt(characters.length())); + } + password[i] = nextChar; } return new String(password); @@ -82,49 +88,49 @@ public void setupSecureTests() throws IOException { throw new IllegalArgumentException("Secure Tests are running but HTTPS is not set"); createIndexRole(indexAllAccessRole, "*"); createSearchRole(indexSearchAccessRole, "*"); - String alicePassword = generatePassword(); + String alicePassword = generatePassword(aliceUser); createUser(aliceUser, alicePassword, new ArrayList<>(Arrays.asList("odfe"))); aliceClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), aliceUser, alicePassword) .setSocketTimeout(60000) .build(); - String bobPassword = generatePassword(); + String bobPassword = generatePassword(bobUser); createUser(bobUser, bobPassword, new ArrayList<>(Arrays.asList("odfe"))); bobClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), bobUser, bobPassword) .setSocketTimeout(60000) .build(); - String catPassword = generatePassword(); + String catPassword = generatePassword(catUser); createUser(catUser, catPassword, new ArrayList<>(Arrays.asList("aes"))); catClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), catUser, catPassword) .setSocketTimeout(60000) .build(); - String dogPassword = generatePassword(); + String dogPassword = generatePassword(dogUser); createUser(dogUser, dogPassword, new ArrayList<>(Arrays.asList())); dogClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), dogUser, dogPassword) .setSocketTimeout(60000) .build(); - String elkPassword = generatePassword(); + String elkPassword = generatePassword(elkUser); createUser(elkUser, elkPassword, new ArrayList<>(Arrays.asList("odfe"))); elkClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), elkUser, elkPassword) .setSocketTimeout(60000) .build(); - String fishPassword = generatePassword(); + String fishPassword = generatePassword(fishUser); createUser(fishUser, fishPassword, new ArrayList<>(Arrays.asList("odfe", "aes"))); fishClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), fishUser, fishPassword) .setSocketTimeout(60000) .build(); - String goatPassword = generatePassword(); + String goatPassword = generatePassword(goatUser); createUser(goatUser, goatPassword, new ArrayList<>(Arrays.asList("opensearch"))); goatClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), goatUser, goatPassword) .setSocketTimeout(60000) .build(); - String lionPassword = generatePassword(); + String lionPassword = generatePassword(lionUser); createUser(lionUser, lionPassword, new ArrayList<>(Arrays.asList("opensearch"))); lionClient = new SecureRestClientBuilder(getClusterHosts().toArray(new HttpHost[0]), isHttps(), lionUser, 
lionPassword) .setSocketTimeout(60000) @@ -159,7 +165,7 @@ public void deleteUserSetup() throws IOException { public void testCreateAnomalyDetectorWithWriteAccess() throws IOException { // User Alice has AD full access, should be able to create a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); - Assert.assertNotNull("User alice could not create detector", aliceDetector.getDetectorId()); + Assert.assertNotNull("User alice could not create detector", aliceDetector.getId()); } public void testCreateAnomalyDetectorWithReadAccess() { @@ -171,33 +177,26 @@ public void testCreateAnomalyDetectorWithReadAccess() { public void testStartDetectorWithReadAccess() throws IOException { // User Bob has AD read access, should not be able to modify a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); - Assert.assertNotNull(aliceDetector.getDetectorId()); - Exception exception = expectThrows( - IOException.class, - () -> { startAnomalyDetector(aliceDetector.getDetectorId(), null, bobClient); } - ); + Assert.assertNotNull(aliceDetector.getId()); + Exception exception = expectThrows(IOException.class, () -> { startAnomalyDetector(aliceDetector.getId(), null, bobClient); }); Assert.assertTrue(exception.getMessage().contains("no permissions for [cluster:admin/opendistro/ad/detector/jobmanagement]")); } public void testStartDetectorForWriteUser() throws IOException { // User Alice has AD full access, should be able to modify a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); - Assert.assertNotNull(aliceDetector.getDetectorId()); + Assert.assertNotNull(aliceDetector.getId()); Instant now = Instant.now(); - Response response = startAnomalyDetector( - aliceDetector.getDetectorId(), - new DetectionDateRange(now.minus(10, ChronoUnit.DAYS), now), - aliceClient - ); - Assert.assertEquals(response.getStatusLine().toString(), "HTTP/1.1 200 OK"); + Response response = startAnomalyDetector(aliceDetector.getId(), new DateRange(now.minus(10, ChronoUnit.DAYS), now), aliceClient); + MatcherAssert.assertThat(response.getStatusLine().toString(), CoreMatchers.containsString("200 OK")); } public void testFilterByDisabled() throws IOException { // User Alice has AD full access, should be able to create a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); // User Cat has AD full access, should be able to get a detector - AnomalyDetector detector = getAnomalyDetector(aliceDetector.getDetectorId(), catClient); - Assert.assertEquals(aliceDetector.getDetectorId(), detector.getDetectorId()); + AnomalyDetector detector = getConfig(aliceDetector.getId(), catClient); + Assert.assertEquals(aliceDetector.getId(), detector.getId()); } public void testGetApiFilterByEnabled() throws IOException { @@ -206,11 +205,8 @@ public void testGetApiFilterByEnabled() throws IOException { enableFilterBy(); // User Cat has AD full access, but is part of different backend role so Cat should not be able to access // Alice detector - Exception exception = expectThrows(IOException.class, () -> { getAnomalyDetector(aliceDetector.getDetectorId(), catClient); }); - Assert - .assertTrue( - exception.getMessage().contains("User does not have permissions to access detector: " + aliceDetector.getDetectorId()) - ); + Exception exception = expectThrows(IOException.class, () -> { getConfig(aliceDetector.getId(), catClient); }); + Assert.assertTrue(exception.getMessage().contains("User 
does not have permissions to access config: " + aliceDetector.getId())); } private void confirmingClientIsAdmin() throws IOException { @@ -233,7 +229,7 @@ public void testGetApiFilterByEnabledForAdmin() throws IOException { AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); enableFilterBy(); confirmingClientIsAdmin(); - AnomalyDetector detector = getAnomalyDetector(aliceDetector.getDetectorId(), client()); + AnomalyDetector detector = getConfig(aliceDetector.getId(), client()); Assert .assertArrayEquals( "User backend role of detector doesn't change", @@ -248,7 +244,7 @@ public void testUpdateApiFilterByEnabledForAdmin() throws IOException { enableFilterBy(); AnomalyDetector newDetector = new AnomalyDetector( - aliceDetector.getDetectorId(), + aliceDetector.getId(), aliceDetector.getVersion(), aliceDetector.getName(), randomAlphaOfLength(10), @@ -256,26 +252,27 @@ public void testUpdateApiFilterByEnabledForAdmin() throws IOException { aliceDetector.getIndices(), aliceDetector.getFeatureAttributes(), aliceDetector.getFilterQuery(), - aliceDetector.getDetectionInterval(), + aliceDetector.getInterval(), aliceDetector.getWindowDelay(), aliceDetector.getShingleSize(), aliceDetector.getUiMetadata(), aliceDetector.getSchemaVersion(), Instant.now(), - aliceDetector.getCategoryField(), + aliceDetector.getCategoryFields(), new User( randomAlphaOfLength(5), ImmutableList.of("odfe", randomAlphaOfLength(5)), ImmutableList.of(randomAlphaOfLength(5)), ImmutableList.of(randomAlphaOfLength(5)) ), - null + null, + aliceDetector.getImputationOption() ); // User client has admin all access, and has "opensearch" backend role so client should be able to update detector // But the detector's backend role should not be replaced as client's backend roles (all_access). - Response response = updateAnomalyDetector(aliceDetector.getDetectorId(), newDetector, client()); + Response response = updateAnomalyDetector(aliceDetector.getId(), newDetector, client()); Assert.assertEquals(response.getStatusLine().getStatusCode(), 200); - AnomalyDetector anomalyDetector = getAnomalyDetector(aliceDetector.getDetectorId(), aliceClient); + AnomalyDetector anomalyDetector = getConfig(aliceDetector.getId(), aliceClient); Assert .assertArrayEquals( "odfe is still the backendrole, not opensearch", @@ -294,7 +291,7 @@ public void testUpdateApiFilterByEnabled() throws IOException { aliceDetector.getUser().getBackendRoles().toArray(new String[0]) ); AnomalyDetector newDetector = new AnomalyDetector( - aliceDetector.getDetectorId(), + aliceDetector.getId(), aliceDetector.getVersion(), aliceDetector.getName(), randomAlphaOfLength(10), @@ -302,28 +299,29 @@ public void testUpdateApiFilterByEnabled() throws IOException { aliceDetector.getIndices(), aliceDetector.getFeatureAttributes(), aliceDetector.getFilterQuery(), - aliceDetector.getDetectionInterval(), + aliceDetector.getInterval(), aliceDetector.getWindowDelay(), aliceDetector.getShingleSize(), aliceDetector.getUiMetadata(), aliceDetector.getSchemaVersion(), Instant.now(), - aliceDetector.getCategoryField(), + aliceDetector.getCategoryFields(), new User( randomAlphaOfLength(5), ImmutableList.of("odfe", randomAlphaOfLength(5)), ImmutableList.of(randomAlphaOfLength(5)), ImmutableList.of(randomAlphaOfLength(5)) ), - null + null, + aliceDetector.getImputationOption() ); enableFilterBy(); // User Fish has AD full access, and has "odfe" backend role which is one of Alice's backend role, so // Fish should be able to update detectors created by Alice. 
But the detector's backend role should // not be replaced as Fish's backend roles. - Response response = updateAnomalyDetector(aliceDetector.getDetectorId(), newDetector, fishClient); + Response response = updateAnomalyDetector(aliceDetector.getId(), newDetector, fishClient); Assert.assertEquals(response.getStatusLine().getStatusCode(), 200); - AnomalyDetector anomalyDetector = getAnomalyDetector(aliceDetector.getDetectorId(), aliceClient); + AnomalyDetector anomalyDetector = getConfig(aliceDetector.getId(), aliceClient); Assert .assertArrayEquals( "Wrong user roles", @@ -340,12 +338,9 @@ public void testStartApiFilterByEnabled() throws IOException { // Alice detector Instant now = Instant.now(); Exception exception = expectThrows(IOException.class, () -> { - startAnomalyDetector(aliceDetector.getDetectorId(), new DetectionDateRange(now.minus(10, ChronoUnit.DAYS), now), catClient); + startAnomalyDetector(aliceDetector.getId(), new DateRange(now.minus(10, ChronoUnit.DAYS), now), catClient); }); - Assert - .assertTrue( - exception.getMessage().contains("User does not have permissions to access detector: " + aliceDetector.getDetectorId()) - ); + Assert.assertTrue(exception.getMessage().contains("User does not have permissions to access config: " + aliceDetector.getId())); } public void testStopApiFilterByEnabled() throws IOException { @@ -354,14 +349,8 @@ public void testStopApiFilterByEnabled() throws IOException { enableFilterBy(); // User Cat has AD full access, but is part of different backend role so Cat should not be able to access // Alice detector - Exception exception = expectThrows( - IOException.class, - () -> { stopAnomalyDetector(aliceDetector.getDetectorId(), catClient, true); } - ); - Assert - .assertTrue( - exception.getMessage().contains("User does not have permissions to access detector: " + aliceDetector.getDetectorId()) - ); + Exception exception = expectThrows(IOException.class, () -> { stopAnomalyDetector(aliceDetector.getId(), catClient, true); }); + Assert.assertTrue(exception.getMessage().contains("User does not have permissions to access config: " + aliceDetector.getId())); } public void testDeleteApiFilterByEnabled() throws IOException { @@ -370,11 +359,8 @@ public void testDeleteApiFilterByEnabled() throws IOException { enableFilterBy(); // User Cat has AD full access, but is part of different backend role so Cat should not be able to access // Alice detector - Exception exception = expectThrows(IOException.class, () -> { deleteAnomalyDetector(aliceDetector.getDetectorId(), catClient); }); - Assert - .assertTrue( - exception.getMessage().contains("User does not have permissions to access detector: " + aliceDetector.getDetectorId()) - ); + Exception exception = expectThrows(IOException.class, () -> { deleteAnomalyDetector(aliceDetector.getId(), catClient); }); + Assert.assertTrue(exception.getMessage().contains("User does not have permissions to access config: " + aliceDetector.getId())); } public void testCreateAnomalyDetectorWithNoBackendRole() throws IOException { @@ -403,29 +389,29 @@ public void testCreateAnomalyDetectorWithCustomResultIndex() throws IOException AnomalyDetector anomalyDetector = createRandomAnomalyDetector(false, false, aliceClient); // User elk has AD full access, but has no read permission of index - String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test"; + String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test"; AnomalyDetector detector = cloneDetector(anomalyDetector, resultIndex); // User goat has no permission 
to create index Exception exception = expectThrows(IOException.class, () -> { createAnomalyDetector(detector, true, goatClient); }); Assert.assertTrue(exception.getMessage().contains("no permissions for [indices:admin/create]")); // User cat has permission to create index - resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test2"; + resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test2"; TestHelpers.createIndexWithTimeField(client(), anomalyDetector.getIndices().get(0), anomalyDetector.getTimeField()); AnomalyDetector detectorOfCat = createAnomalyDetector(cloneDetector(anomalyDetector, resultIndex), true, catClient); - assertEquals(resultIndex, detectorOfCat.getResultIndex()); + assertEquals(resultIndex, detectorOfCat.getCustomResultIndex()); } public void testPreviewAnomalyDetectorWithWriteAccess() throws IOException { // User Alice has AD full access, should be able to create/preview a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - aliceDetector.getDetectorId(), + aliceDetector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), null ); - Response response = previewAnomalyDetector(aliceDetector.getDetectorId(), aliceClient, input); + Response response = previewAnomalyDetector(aliceDetector.getId(), aliceClient, input); Assert.assertEquals(RestStatus.OK, TestHelpers.restStatus(response)); } @@ -439,10 +425,7 @@ public void testPreviewAnomalyDetectorWithReadAccess() throws IOException { null ); // User bob has AD read access, should not be able to preview a detector - Exception exception = expectThrows( - IOException.class, - () -> { previewAnomalyDetector(aliceDetector.getDetectorId(), bobClient, input); } - ); + Exception exception = expectThrows(IOException.class, () -> { previewAnomalyDetector(aliceDetector.getId(), bobClient, input); }); Assert.assertTrue(exception.getMessage().contains("no permissions for [cluster:admin/opendistro/ad/detector/preview]")); } @@ -450,7 +433,7 @@ public void testPreviewAnomalyDetectorWithFilterEnabled() throws IOException { // User Alice has AD full access, should be able to create a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - aliceDetector.getDetectorId(), + aliceDetector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), null @@ -458,31 +441,22 @@ public void testPreviewAnomalyDetectorWithFilterEnabled() throws IOException { enableFilterBy(); // User Cat has AD full access, but is part of different backend role so Cat should not be able to access // Alice detector - Exception exception = expectThrows( - IOException.class, - () -> { previewAnomalyDetector(aliceDetector.getDetectorId(), catClient, input); } - ); - Assert - .assertTrue( - exception.getMessage().contains("User does not have permissions to access detector: " + aliceDetector.getDetectorId()) - ); + Exception exception = expectThrows(IOException.class, () -> { previewAnomalyDetector(aliceDetector.getId(), catClient, input); }); + Assert.assertTrue(exception.getMessage().contains("User does not have permissions to access config: " + aliceDetector.getId())); } public void testPreviewAnomalyDetectorWithNoReadPermissionOfIndex() throws IOException { // User Alice has AD full access, should be able to create a detector AnomalyDetector aliceDetector = createRandomAnomalyDetector(false, false, aliceClient); 
AnomalyDetectorExecutionInput input = new AnomalyDetectorExecutionInput( - aliceDetector.getDetectorId(), + aliceDetector.getId(), Instant.now().minusSeconds(60 * 10), Instant.now(), aliceDetector ); enableFilterBy(); // User elk has no read permission of index - Exception exception = expectThrows( - Exception.class, - () -> { previewAnomalyDetector(aliceDetector.getDetectorId(), elkClient, input); } - ); + Exception exception = expectThrows(Exception.class, () -> { previewAnomalyDetector(aliceDetector.getId(), elkClient, input); }); Assert .assertTrue( "actual msg: " + exception.getMessage(), diff --git a/src/test/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandlerTests.java b/src/test/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandlerTests.java index 63f394d2e..3bb0f1fbb 100644 --- a/src/test/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandlerTests.java +++ b/src/test/java/org/opensearch/ad/rest/handler/IndexAnomalyDetectorJobActionHandlerTests.java @@ -21,7 +21,7 @@ import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; import static org.opensearch.action.DocWriteResponse.Result.CREATED; -import static org.opensearch.ad.constant.CommonErrorMessages.CAN_NOT_FIND_LATEST_TASK; +import static org.opensearch.timeseries.constant.CommonMessages.CAN_NOT_FIND_LATEST_TASK; import java.io.IOException; import java.util.Arrays; @@ -34,26 +34,19 @@ import org.opensearch.action.index.IndexResponse; import org.opensearch.action.update.UpdateResponse; import org.opensearch.ad.ExecuteADResultResponseRecorder; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.common.exception.InternalFailure; -import org.opensearch.ad.common.exception.ResourceNotFoundException; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.constant.ADCommonMessages; +import org.opensearch.ad.constant.ADCommonName; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; -import org.opensearch.ad.model.Feature; import org.opensearch.ad.task.ADTaskCacheManager; import org.opensearch.ad.task.ADTaskManager; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.ad.transport.AnomalyResultAction; import org.opensearch.ad.transport.AnomalyResultResponse; import org.opensearch.ad.transport.ProfileAction; import org.opensearch.ad.transport.ProfileResponse; import org.opensearch.ad.transport.handler.AnomalyIndexHandler; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.common.unit.TimeValue; import org.opensearch.core.action.ActionListener; @@ -61,13 +54,20 @@ import org.opensearch.search.aggregations.AggregationBuilder; import org.opensearch.test.OpenSearchTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.InternalFailure; +import org.opensearch.timeseries.common.exception.ResourceNotFoundException; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.transport.JobResponse; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import 
org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableList; public class IndexAnomalyDetectorJobActionHandlerTests extends OpenSearchTestCase { - private static AnomalyDetectionIndices anomalyDetectionIndices; + private static ADIndexManagement anomalyDetectionIndices; private static String detectorId; private static Long seqNo; private static Long primaryTerm; @@ -94,12 +94,12 @@ public static void setOnce() throws IOException { detectorId = "123"; seqNo = 1L; primaryTerm = 2L; - anomalyDetectionIndices = mock(AnomalyDetectionIndices.class); + anomalyDetectionIndices = mock(ADIndexManagement.class); xContentRegistry = NamedXContentRegistry.EMPTY; transportService = mock(TransportService.class); requestTimeout = TimeValue.timeValueMinutes(60); - when(anomalyDetectionIndices.doesAnomalyDetectorJobIndexExist()).thenReturn(true); + when(anomalyDetectionIndices.doesJobIndexExist()).thenReturn(true); nodeFilter = mock(DiscoveryNodeFilterer.class); detector = TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId, Arrays.asList("a")); @@ -146,9 +146,9 @@ public void setUp() throws Exception { adTaskManager = mock(ADTaskManager.class); doAnswer(invocation -> { Object[] args = invocation.getArguments(); - ActionListener listener = (ActionListener) args[4]; + ActionListener listener = (ActionListener) args[4]; - AnomalyDetectorJobResponse response = mock(AnomalyDetectorJobResponse.class); + JobResponse response = mock(JobResponse.class); listener.onResponse(response); return null; @@ -193,7 +193,7 @@ public void setUp() throws Exception { public void testDelayHCProfile() { when(adTaskManager.isHCRealtimeTaskStartInitializing(anyString())).thenReturn(false); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); handler.startAnomalyDetectorJob(detector, listener); @@ -220,7 +220,7 @@ public void testNoDelayHCProfile() { when(adTaskManager.isHCRealtimeTaskStartInitializing(anyString())).thenReturn(true); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); handler.startAnomalyDetectorJob(detector, listener); @@ -246,7 +246,7 @@ public void testHCProfileException() { when(adTaskManager.isHCRealtimeTaskStartInitializing(anyString())).thenReturn(true); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); handler.startAnomalyDetectorJob(detector, listener); @@ -283,7 +283,7 @@ public void testUpdateLatestRealtimeTaskOnCoordinatingNodeResourceNotFoundExcept return null; }).when(adTaskManager).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); handler.startAnomalyDetectorJob(detector, listener); @@ -321,7 +321,7 @@ public void testUpdateLatestRealtimeTaskOnCoordinatingException() { return null; }).when(adTaskManager).updateLatestRealtimeTaskOnCoordinatingNode(any(), any(), any(), any(), any(), any()); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); handler.startAnomalyDetectorJob(detector, listener); @@ -342,12 +342,12 @@ public void testIndexException() throws IOException { Object[] args = invocation.getArguments(); ActionListener listener = (ActionListener) args[2]; - listener.onFailure(new InternalFailure(detectorId, CommonErrorMessages.NO_MODEL_ERR_MSG)); + listener.onFailure(new 
InternalFailure(detectorId, ADCommonMessages.NO_MODEL_ERR_MSG));
             return null;
         }).when(client).execute(any(AnomalyResultAction.class), any(), any());

-        ActionListener<AnomalyDetectorJobResponse> listener = mock(ActionListener.class);
+        ActionListener<JobResponse> listener = mock(ActionListener.class);
         AggregationBuilder aggregationBuilder = TestHelpers
             .parseAggregation("{\"test\":{\"max\":{\"field\":\"" + MockSimpleLog.VALUE_FIELD + "\"}}}");
         Feature feature = new Feature(randomAlphaOfLength(5), randomAlphaOfLength(10), true, aggregationBuilder);
@@ -358,7 +358,7 @@ public void testIndexException() throws IOException {
             10,
             MockSimpleLog.TIME_FIELD,
             null,
-            CommonName.CUSTOM_RESULT_INDEX_PREFIX + "index"
+            ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "index"
         );
         when(anomalyDetectionIndices.doesIndexExist(anyString())).thenReturn(false);
         handler.startAnomalyDetectorJob(detector, listener);
diff --git a/src/test/java/org/opensearch/ad/settings/ADEnabledSettingTests.java b/src/test/java/org/opensearch/ad/settings/ADEnabledSettingTests.java
new file mode 100644
index 000000000..6de90a068
--- /dev/null
+++ b/src/test/java/org/opensearch/ad/settings/ADEnabledSettingTests.java
@@ -0,0 +1,75 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.ad.settings;
+
+import static org.mockito.Mockito.mock;
+import static org.opensearch.common.settings.Setting.Property.Dynamic;
+
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.mockito.Mockito;
+import org.opensearch.cluster.service.ClusterService;
+import org.opensearch.common.settings.ClusterSettings;
+import org.opensearch.common.settings.Setting;
+import org.opensearch.common.settings.Settings;
+import org.opensearch.test.OpenSearchTestCase;
+
+public class ADEnabledSettingTests extends OpenSearchTestCase {
+
+    public void testIsADEnabled() {
+        assertTrue(ADEnabledSetting.isADEnabled());
+        ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.AD_ENABLED, false);
+        assertTrue(!ADEnabledSetting.isADEnabled());
+        ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.AD_ENABLED, true);
+    }
+
+    public void testIsADBreakerEnabled() {
+        assertTrue(ADEnabledSetting.isADBreakerEnabled());
+        ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.AD_BREAKER_ENABLED, false);
+        assertTrue(!ADEnabledSetting.isADBreakerEnabled());
+    }
+
+    public void testIsInterpolationInColdStartEnabled() {
+        assertTrue(!ADEnabledSetting.isInterpolationInColdStartEnabled());
+        ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.INTERPOLATION_IN_HCAD_COLD_START_ENABLED, true);
+        assertTrue(ADEnabledSetting.isInterpolationInColdStartEnabled());
+    }
+
+    public void testIsDoorKeeperInCacheEnabled() {
+        assertTrue(!ADEnabledSetting.isDoorKeeperInCacheEnabled());
+        ADEnabledSetting.getInstance().setSettingValue(ADEnabledSetting.DOOR_KEEPER_IN_CACHE_ENABLED, true);
+        assertTrue(ADEnabledSetting.isDoorKeeperInCacheEnabled());
+    }
+
+    public void testSetSettingsUpdateConsumers() {
+        Setting<Boolean> testSetting = Setting.boolSetting("test.setting", true, Setting.Property.NodeScope, Dynamic);
+        Map<String, Setting<?>> settings = new HashMap<>();
+        settings.put("test.setting", testSetting);
+        ADEnabledSetting dynamicNumericSetting = new ADEnabledSetting(settings);
+        ClusterSettings clusterSettings = new ClusterSettings(Settings.EMPTY, Collections.singleton(testSetting));
+        ClusterService clusterService = mock(ClusterService.class);
+        Mockito.when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
+
+        dynamicNumericSetting.init(clusterService);
+
+        assertEquals(true, dynamicNumericSetting.getSettingValue("test.setting"));
+    }
+
+    public void testGetSettings() {
+        Setting<Boolean> testSetting1 = Setting.boolSetting("test.setting1", true, Setting.Property.NodeScope);
+        Setting<Boolean> testSetting2 = Setting.boolSetting("test.setting2", false, Setting.Property.NodeScope);
+        Map<String, Setting<?>> settings = new HashMap<>();
+        settings.put("test.setting1", testSetting1);
+        settings.put("test.setting2", testSetting2);
+        ADEnabledSetting dynamicNumericSetting = new ADEnabledSetting(settings);
+        List<Setting<?>> returnedSettings = dynamicNumericSetting.getSettings();
+        assertEquals(2, returnedSettings.size());
+        assertTrue(returnedSettings.containsAll(settings.values()));
+    }
+}
diff --git a/src/test/java/org/opensearch/ad/settings/ADNumericSettingTests.java b/src/test/java/org/opensearch/ad/settings/ADNumericSettingTests.java
new file mode 100644
index 000000000..71d131641
--- /dev/null
+++ b/src/test/java/org/opensearch/ad/settings/ADNumericSettingTests.java
@@ -0,0 +1,50 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.ad.settings;
+
+import java.util.HashMap;
+import java.util.Map;
+
+import org.junit.Before;
+import org.opensearch.common.settings.Setting;
+import org.opensearch.test.OpenSearchTestCase;
+
+public class ADNumericSettingTests extends OpenSearchTestCase {
+    private ADNumericSetting adSetting;
+
+    @Override
+    @Before
+    public void setUp() throws Exception {
+        super.setUp();
+        adSetting = ADNumericSetting.getInstance();
+    }
+
+    public void testMaxCategoricalFields() {
+        adSetting.setSettingValue(ADNumericSetting.CATEGORY_FIELD_LIMIT, 3);
+        int value = ADNumericSetting.maxCategoricalFields();
+        assertEquals("Expected value is 3", 3, value);
+    }
+
+    public void testGetSettingValue() {
+        Map<String, Setting<?>> settingsMap = new HashMap<>();
+        Setting<Integer> testSetting = Setting.intSetting("test.setting", 1, Setting.Property.NodeScope);
+        settingsMap.put("test.setting", testSetting);
+        adSetting = new ADNumericSetting(settingsMap);
+
+        adSetting.setSettingValue("test.setting", 2);
+        Integer value = adSetting.getSettingValue("test.setting");
+        assertEquals("Expected value is 2", 2, value.intValue());
+    }
+
+    public void testGetSettingNonexistentKey() {
+        try {
+            adSetting.getSettingValue("nonexistent.key");
+            fail("Expected an IllegalArgumentException to be thrown");
+        } catch (IllegalArgumentException e) {
+            assertEquals("Cannot find setting by key [nonexistent.key]", e.getMessage());
+        }
+    }
+}
diff --git a/src/test/java/org/opensearch/ad/settings/AnomalyDetectorSettingsTests.java b/src/test/java/org/opensearch/ad/settings/AnomalyDetectorSettingsTests.java
index 95fdbc40b..085ea5959 100644
--- a/src/test/java/org/opensearch/ad/settings/AnomalyDetectorSettingsTests.java
+++ b/src/test/java/org/opensearch/ad/settings/AnomalyDetectorSettingsTests.java
@@ -15,19 +15,20 @@
 import java.util.List;

 import org.junit.Before;
-import org.opensearch.ad.AnomalyDetectorPlugin;
 import org.opensearch.common.settings.Setting;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;

 @SuppressWarnings({ "rawtypes" })
 public class AnomalyDetectorSettingsTests extends OpenSearchTestCase {
-    AnomalyDetectorPlugin plugin;
+ TimeSeriesAnalyticsPlugin plugin; @Before public void setup() { - this.plugin = new AnomalyDetectorPlugin(); + this.plugin = new TimeSeriesAnalyticsPlugin(); } public void testAllLegacyOpenDistroSettingsReturned() { @@ -57,7 +58,7 @@ public void testAllLegacyOpenDistroSettingsReturned() { LegacyOpenDistroAnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, LegacyOpenDistroAnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT, LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS, - LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, + LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, LegacyOpenDistroAnomalyDetectorSettings.MAX_CACHE_MISS_HANDLING_PER_SECOND, LegacyOpenDistroAnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, LegacyOpenDistroAnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS, @@ -76,47 +77,49 @@ public void testAllOpenSearchSettingsReturned() { .containsAll( Arrays .asList( - AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, - AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS, + AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS, + AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS, AnomalyDetectorSettings.MAX_ANOMALY_FEATURES, - AnomalyDetectorSettings.REQUEST_TIMEOUT, + AnomalyDetectorSettings.AD_REQUEST_TIMEOUT, AnomalyDetectorSettings.DETECTION_INTERVAL, AnomalyDetectorSettings.DETECTION_WINDOW_DELAY, AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD, AnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS_PER_SHARD, - AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, - AnomalyDetectorSettings.COOLDOWN_MINUTES, - AnomalyDetectorSettings.BACKOFF_MINUTES, - AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY, - AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF, + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_UNRESPONSIVE_NODE, + AnomalyDetectorSettings.AD_COOLDOWN_MINUTES, + AnomalyDetectorSettings.AD_BACKOFF_MINUTES, + AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY, + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF, AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD, - AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE, - AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY, + AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE, + AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY, AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, - AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT, - AnomalyDetectorSettings.INDEX_PRESSURE_HARD_LIMIT, - AnomalyDetectorSettings.MAX_PRIMARY_SHARDS, - AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, + AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT, + AnomalyDetectorSettings.AD_INDEX_PRESSURE_HARD_LIMIT, + AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS, + AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS, AnomalyDetectorSettings.MAX_OLD_AD_TASK_DOCS_PER_DETECTOR, AnomalyDetectorSettings.BATCH_TASK_PIECE_SIZE, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_CONCURRENCY, - AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_BATCH_SIZE, - AnomalyDetectorSettings.DEDICATED_CACHE_SIZE, - AnomalyDetectorSettings.COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, - 
AnomalyDetectorSettings.CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, - AnomalyDetectorSettings.EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, - AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY, - AnomalyDetectorSettings.PAGE_SIZE + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_CONCURRENCY, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_BATCH_SIZE, + AnomalyDetectorSettings.AD_DEDICATED_CACHE_SIZE, + AnomalyDetectorSettings.AD_COLD_ENTITY_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_READ_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_CHECKPOINT_WRITE_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_RESULT_WRITE_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_ENTITY_COLD_START_QUEUE_MAX_HEAP_PERCENT, + AnomalyDetectorSettings.AD_EXPECTED_COLD_ENTITY_EXECUTION_TIME_IN_MILLISECS, + AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY, + AnomalyDetectorSettings.AD_PAGE_SIZE, + TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, + TimeSeriesSettings.BACKOFF_MINUTES ) ) ); @@ -124,11 +127,11 @@ public void testAllOpenSearchSettingsReturned() { public void testAllLegacyOpenDistroSettingsFallback() { assertEquals( - AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(Settings.EMPTY) ); assertEquals( @@ -136,7 +139,7 @@ public void testAllLegacyOpenDistroSettingsFallback() { LegacyOpenDistroAnomalyDetectorSettings.MAX_ANOMALY_FEATURES.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.REQUEST_TIMEOUT.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.REQUEST_TIMEOUT.get(Settings.EMPTY) ); assertEquals( @@ -152,23 +155,23 @@ public void testAllLegacyOpenDistroSettingsFallback() { LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(Settings.EMPTY), + TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.COOLDOWN_MINUTES.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_COOLDOWN_MINUTES.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.COOLDOWN_MINUTES.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.BACKOFF_MINUTES.get(Settings.EMPTY), + TimeSeriesSettings.BACKOFF_MINUTES.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_MINUTES.get(Settings.EMPTY) ); assertEquals( - 
AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(Settings.EMPTY) ); assertEquals( @@ -176,20 +179,20 @@ public void testAllLegacyOpenDistroSettingsFallback() { LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(Settings.EMPTY) ); // MAX_ENTITIES_FOR_PREVIEW does not use legacy setting assertEquals(Integer.valueOf(5), AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW.get(Settings.EMPTY)); // INDEX_PRESSURE_SOFT_LIMIT does not use legacy setting - assertEquals(Float.valueOf(0.6f), AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT.get(Settings.EMPTY)); + assertEquals(Float.valueOf(0.6f), AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT.get(Settings.EMPTY)); assertEquals( - AnomalyDetectorSettings.MAX_PRIMARY_SHARDS.get(Settings.EMPTY), + AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS.get(Settings.EMPTY), LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS.get(Settings.EMPTY) ); assertEquals( - AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(Settings.EMPTY), - LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(Settings.EMPTY) + AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(Settings.EMPTY), + LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(Settings.EMPTY) ); assertEquals( AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE.get(Settings.EMPTY), @@ -211,15 +214,15 @@ public void testAllLegacyOpenDistroSettingsFallback() { public void testSettingsGetValue() { Settings settings = Settings.builder().put("plugins.anomaly_detection.request_timeout", "42s").build(); - assertEquals(AnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings), TimeValue.timeValueSeconds(42)); + assertEquals(AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(settings), TimeValue.timeValueSeconds(42)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings), TimeValue.timeValueSeconds(10)); settings = Settings.builder().put("plugins.anomaly_detection.max_anomaly_detectors", 99).build(); - assertEquals(AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(99)); + assertEquals(AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(99)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(1000)); settings = Settings.builder().put("plugins.anomaly_detection.max_multi_entity_anomaly_detectors", 98).build(); - assertEquals(AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(98)); + assertEquals(AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS.get(settings), Integer.valueOf(98)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(10)); settings = Settings.builder().put("plugins.anomaly_detection.max_anomaly_features", 7).build(); @@ -253,39 
+256,45 @@ public void testSettingsGetValue() { assertEquals(LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD.get(settings), TimeValue.timeValueDays(30)); settings = Settings.builder().put("plugins.anomaly_detection.max_retry_for_unresponsive_node", 91).build(); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(91)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(91)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(5)); + settings = Settings.builder().put("plugins.timeseries.max_retry_for_unresponsive_node", 91).build(); + assertEquals(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(91)); + settings = Settings.builder().put("plugins.anomaly_detection.cooldown_minutes", TimeValue.timeValueMinutes(90)).build(); - assertEquals(AnomalyDetectorSettings.COOLDOWN_MINUTES.get(settings), TimeValue.timeValueMinutes(90)); + assertEquals(AnomalyDetectorSettings.AD_COOLDOWN_MINUTES.get(settings), TimeValue.timeValueMinutes(90)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.COOLDOWN_MINUTES.get(settings), TimeValue.timeValueMinutes(5)); settings = Settings.builder().put("plugins.anomaly_detection.backoff_minutes", TimeValue.timeValueMinutes(89)).build(); - assertEquals(AnomalyDetectorSettings.BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(89)); + assertEquals(AnomalyDetectorSettings.AD_BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(89)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(15)); + settings = Settings.builder().put("plugins.timeseries.backoff_minutes", TimeValue.timeValueMinutes(89)).build(); + assertEquals(TimeSeriesSettings.BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(89)); + settings = Settings.builder().put("plugins.anomaly_detection.backoff_initial_delay", TimeValue.timeValueMillis(88)).build(); - assertEquals(AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(settings), TimeValue.timeValueMillis(88)); + assertEquals(AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY.get(settings), TimeValue.timeValueMillis(88)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(settings), TimeValue.timeValueMillis(1000)); settings = Settings.builder().put("plugins.anomaly_detection.max_retry_for_backoff", 87).build(); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(settings), Integer.valueOf(87)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF.get(settings), Integer.valueOf(87)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(settings), Integer.valueOf(3)); settings = Settings.builder().put("plugins.anomaly_detection.max_retry_for_end_run_exception", 86).build(); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings), Integer.valueOf(86)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings), Integer.valueOf(86)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings), Integer.valueOf(6)); settings = Settings.builder().put("plugins.anomaly_detection.filter_by_backend_roles", true).build(); - assertEquals(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(true)); - 
assertEquals(LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(false)); + assertEquals(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(true)); + assertEquals(LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(false)); settings = Settings.builder().put("plugins.anomaly_detection.model_max_size_percent", 0.3).build(); - assertEquals(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(settings), Double.valueOf(0.3)); + assertEquals(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.get(settings), Double.valueOf(0.3)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(settings), Double.valueOf(0.1)); settings = Settings.builder().put("plugins.anomaly_detection.max_entities_per_query", 83).build(); - assertEquals(AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY.get(settings), Integer.valueOf(83)); + assertEquals(AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY.get(settings), Integer.valueOf(83)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY.get(settings), Integer.valueOf(1000)); settings = Settings.builder().put("plugins.anomaly_detection.max_entities_for_preview", 22).build(); @@ -293,11 +302,11 @@ public void testSettingsGetValue() { assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW.get(settings), Integer.valueOf(30)); settings = Settings.builder().put("plugins.anomaly_detection.index_pressure_soft_limit", 81f).build(); - assertEquals(AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT.get(settings), Float.valueOf(81f)); + assertEquals(AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT.get(settings), Float.valueOf(81f)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT.get(settings), Float.valueOf(0.8f)); settings = Settings.builder().put("plugins.anomaly_detection.max_primary_shards", 80).build(); - assertEquals(AnomalyDetectorSettings.MAX_PRIMARY_SHARDS.get(settings), Integer.valueOf(80)); + assertEquals(AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS.get(settings), Integer.valueOf(80)); assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS.get(settings), Integer.valueOf(10)); settings = Settings.builder().put("plugins.anomaly_detection.max_cache_miss_handling_per_second", 79).build(); @@ -333,8 +342,10 @@ public void testSettingsGetValueWithLegacyFallback() { .put("opendistro.anomaly_detection.ad_result_history_max_docs", 8L) .put("opendistro.anomaly_detection.ad_result_history_retention_period", "9d") .put("opendistro.anomaly_detection.max_retry_for_unresponsive_node", 10) + .put("plugins.timeseries.max_retry_for_unresponsive_node", 10) .put("opendistro.anomaly_detection.cooldown_minutes", "11m") .put("opendistro.anomaly_detection.backoff_minutes", "12m") + .put("plugins.timeseries.backoff_minutes", "12m") .put("opendistro.anomaly_detection.backoff_initial_delay", "13ms") // .put("opendistro.anomaly_detection.max_retry_for_backoff", 14) .put("opendistro.anomaly_detection.max_retry_for_end_run_exception", 15) @@ -350,29 +361,31 @@ public void testSettingsGetValueWithLegacyFallback() { .put("opendistro.anomaly_detection.batch_task_piece_interval_seconds", 26) .build(); - assertEquals(AnomalyDetectorSettings.MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(1)); - assertEquals(AnomalyDetectorSettings.MAX_MULTI_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(2)); + 
assertEquals(AnomalyDetectorSettings.AD_MAX_SINGLE_ENTITY_ANOMALY_DETECTORS.get(settings), Integer.valueOf(1)); + assertEquals(AnomalyDetectorSettings.AD_MAX_HC_ANOMALY_DETECTORS.get(settings), Integer.valueOf(2)); assertEquals(AnomalyDetectorSettings.MAX_ANOMALY_FEATURES.get(settings), Integer.valueOf(3)); - assertEquals(AnomalyDetectorSettings.REQUEST_TIMEOUT.get(settings), TimeValue.timeValueSeconds(4)); + assertEquals(AnomalyDetectorSettings.AD_REQUEST_TIMEOUT.get(settings), TimeValue.timeValueSeconds(4)); assertEquals(AnomalyDetectorSettings.DETECTION_INTERVAL.get(settings), TimeValue.timeValueMinutes(5)); assertEquals(AnomalyDetectorSettings.DETECTION_WINDOW_DELAY.get(settings), TimeValue.timeValueMinutes(6)); assertEquals(AnomalyDetectorSettings.AD_RESULT_HISTORY_ROLLOVER_PERIOD.get(settings), TimeValue.timeValueHours(7)); // AD_RESULT_HISTORY_MAX_DOCS is removed in the new release assertEquals(LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_MAX_DOCS.get(settings), Long.valueOf(8L)); assertEquals(AnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD.get(settings), TimeValue.timeValueDays(9)); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(10)); - assertEquals(AnomalyDetectorSettings.COOLDOWN_MINUTES.get(settings), TimeValue.timeValueMinutes(11)); - assertEquals(AnomalyDetectorSettings.BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(12)); - assertEquals(AnomalyDetectorSettings.BACKOFF_INITIAL_DELAY.get(settings), TimeValue.timeValueMillis(13)); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_BACKOFF.get(settings), Integer.valueOf(14)); - assertEquals(AnomalyDetectorSettings.MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings), Integer.valueOf(15)); - assertEquals(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(true)); - assertEquals(AnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE.get(settings), Double.valueOf(0.6D)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(10)); + assertEquals(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.get(settings), Integer.valueOf(10)); + assertEquals(AnomalyDetectorSettings.AD_COOLDOWN_MINUTES.get(settings), TimeValue.timeValueMinutes(11)); + assertEquals(AnomalyDetectorSettings.AD_BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(12)); + assertEquals(TimeSeriesSettings.BACKOFF_MINUTES.get(settings), TimeValue.timeValueMinutes(12)); + assertEquals(AnomalyDetectorSettings.AD_BACKOFF_INITIAL_DELAY.get(settings), TimeValue.timeValueMillis(13)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_BACKOFF.get(settings), Integer.valueOf(14)); + assertEquals(AnomalyDetectorSettings.AD_MAX_RETRY_FOR_END_RUN_EXCEPTION.get(settings), Integer.valueOf(15)); + assertEquals(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.get(settings), Boolean.valueOf(true)); + assertEquals(AnomalyDetectorSettings.AD_MODEL_MAX_SIZE_PERCENTAGE.get(settings), Double.valueOf(0.6D)); // MAX_ENTITIES_FOR_PREVIEW uses default instead of legacy fallback assertEquals(AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW.get(settings), Integer.valueOf(5)); // INDEX_PRESSURE_SOFT_LIMIT uses default instead of legacy fallback - assertEquals(AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT.get(settings), Float.valueOf(0.6F)); - assertEquals(AnomalyDetectorSettings.MAX_PRIMARY_SHARDS.get(settings), Integer.valueOf(21)); + assertEquals(AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT.get(settings), 
Float.valueOf(0.6F)); + assertEquals(AnomalyDetectorSettings.AD_MAX_PRIMARY_SHARDS.get(settings), Integer.valueOf(21)); // MAX_CACHE_MISS_HANDLING_PER_SECOND is removed in the new release assertEquals(LegacyOpenDistroAnomalyDetectorSettings.MAX_CACHE_MISS_HANDLING_PER_SECOND.get(settings), Integer.valueOf(22)); assertEquals(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE.get(settings), Integer.valueOf(23)); @@ -399,7 +412,7 @@ public void testSettingsGetValueWithLegacyFallback() { LegacyOpenDistroAnomalyDetectorSettings.AD_RESULT_HISTORY_RETENTION_PERIOD, LegacyOpenDistroAnomalyDetectorSettings.MODEL_MAX_SIZE_PERCENTAGE, LegacyOpenDistroAnomalyDetectorSettings.MAX_PRIMARY_SHARDS, - LegacyOpenDistroAnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, + LegacyOpenDistroAnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, LegacyOpenDistroAnomalyDetectorSettings.MAX_CACHE_MISS_HANDLING_PER_SECOND, LegacyOpenDistroAnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, LegacyOpenDistroAnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS, diff --git a/src/test/java/org/opensearch/ad/stats/ADStatsTests.java b/src/test/java/org/opensearch/ad/stats/ADStatsTests.java index ee736e0b7..02be33aab 100644 --- a/src/test/java/org/opensearch/ad/stats/ADStatsTests.java +++ b/src/test/java/org/opensearch/ad/stats/ADStatsTests.java @@ -14,7 +14,7 @@ import static org.mockito.ArgumentMatchers.anyString; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE; import java.time.Clock; import java.util.ArrayList; @@ -44,6 +44,9 @@ import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.stats.StatNames; + +import com.amazon.randomcutforest.RandomCutForest; import com.amazon.randomcutforest.RandomCutForest; @@ -99,7 +102,6 @@ public void setup() { IndexUtils indexUtils = mock(IndexUtils.class); when(indexUtils.getIndexHealthStatus(anyString())).thenReturn("yellow"); - when(indexUtils.getNumberOfDocumentsInIndex(anyString())).thenReturn(100L); clusterStatName1 = "clusterStat1"; clusterStatName2 = "clusterStat2"; @@ -107,11 +109,11 @@ public void setup() { nodeStatName1 = "nodeStat1"; nodeStatName2 = "nodeStat2"; - Settings settings = Settings.builder().put(MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); + Settings settings = Settings.builder().put(AD_MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); ClusterService clusterService = mock(ClusterService.class); ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(MAX_MODEL_SIZE_PER_NODE))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AD_MAX_MODEL_SIZE_PER_NODE))) ); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); diff --git a/src/test/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplierTests.java b/src/test/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplierTests.java index fd56ae427..21a9e4aff 100644 --- a/src/test/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplierTests.java +++ b/src/test/java/org/opensearch/ad/stats/suppliers/ModelsOnNodeSupplierTests.java @@ -13,7 +13,7 @@ import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -import static 
org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE; import static org.opensearch.ad.stats.suppliers.ModelsOnNodeSupplier.MODEL_STATE_STAT_KEYS; import java.time.Clock; @@ -90,11 +90,11 @@ public void setup() { @Test public void testGet() { - Settings settings = Settings.builder().put(MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); + Settings settings = Settings.builder().put(AD_MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); ClusterService clusterService = mock(ClusterService.class); ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(MAX_MODEL_SIZE_PER_NODE))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AD_MAX_MODEL_SIZE_PER_NODE))) ); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); diff --git a/src/test/java/org/opensearch/ad/task/ADTaskCacheManagerTests.java b/src/test/java/org/opensearch/ad/task/ADTaskCacheManagerTests.java index 6f5111566..ad14b49c4 100644 --- a/src/test/java/org/opensearch/ad/task/ADTaskCacheManagerTests.java +++ b/src/test/java/org/opensearch/ad/task/ADTaskCacheManagerTests.java @@ -19,9 +19,9 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.MemoryTracker.Origin.HISTORICAL_SINGLE_ENTITY_DETECTOR; -import static org.opensearch.ad.constant.CommonErrorMessages.DETECTOR_IS_RUNNING; +import static org.opensearch.ad.constant.ADCommonMessages.DETECTOR_IS_RUNNING; import static org.opensearch.ad.task.ADTaskCacheManager.TASK_RETRY_LIMIT; +import static org.opensearch.timeseries.MemoryTracker.Origin.HISTORICAL_SINGLE_ENTITY_DETECTOR; import java.io.IOException; import java.time.Instant; @@ -33,12 +33,7 @@ import org.junit.After; import org.junit.Before; -import org.opensearch.ad.MemoryTracker; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.common.exception.DuplicateTaskException; -import org.opensearch.ad.common.exception.LimitExceededException; import org.opensearch.ad.model.ADTask; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.settings.AnomalyDetectorSettings; @@ -46,6 +41,13 @@ import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.MemoryTracker; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.DuplicateTaskException; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.task.RealtimeTaskCache; import com.google.common.collect.ImmutableList; @@ -63,7 +65,7 @@ public void setUp() throws Exception { settings = Settings .builder() .put(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE.getKey(), 2) - .put(AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS.getKey(), 100) + .put(TimeSeriesSettings.MAX_CACHED_DELETED_TASKS.getKey(), 100) .build(); clusterService = mock(ClusterService.class); @@ -72,7 +74,7 @@ public void setUp() throws Exception { Collections .unmodifiableSet( new HashSet<>( - Arrays.asList(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, 
AnomalyDetectorSettings.MAX_CACHED_DELETED_TASKS) + Arrays.asList(AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE, TimeSeriesSettings.MAX_CACHED_DELETED_TASKS) ) ) ); @@ -94,7 +96,7 @@ public void testPutTask() throws IOException { adTaskCacheManager.add(adTask); assertEquals(1, adTaskCacheManager.size()); assertTrue(adTaskCacheManager.contains(adTask.getTaskId())); - assertTrue(adTaskCacheManager.containsTaskOfDetector(adTask.getDetectorId())); + assertTrue(adTaskCacheManager.containsTaskOfDetector(adTask.getConfigId())); assertNotNull(adTaskCacheManager.getTRcfModel(adTask.getTaskId())); assertNotNull(adTaskCacheManager.getShingle(adTask.getTaskId())); assertFalse(adTaskCacheManager.isThresholdModelTrained(adTask.getTaskId())); @@ -113,10 +115,10 @@ public void testPutDuplicateTask() throws IOException { ADTask adTask2 = TestHelpers .randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, adTask1.getExecutionEndTime(), adTask1.getStoppedBy(), - adTask1.getDetectorId(), + adTask1.getConfigId(), adTask1.getDetector(), ADTaskType.HISTORICAL_SINGLE_ENTITY ); @@ -137,26 +139,26 @@ public void testPutMultipleEntityTasks() throws IOException { ADTask adTask1 = TestHelpers .randomAdTask( randomAlphaOfLength(5), - ADTaskState.CREATED, + TaskState.CREATED, Instant.now(), null, - detector.getDetectorId(), + detector.getId(), detector, ADTaskType.HISTORICAL_HC_ENTITY ); ADTask adTask2 = TestHelpers .randomAdTask( randomAlphaOfLength(5), - ADTaskState.CREATED, + TaskState.CREATED, Instant.now(), null, - detector.getDetectorId(), + detector.getId(), detector, ADTaskType.HISTORICAL_HC_ENTITY ); adTaskCacheManager.add(adTask1); adTaskCacheManager.add(adTask2); - List tasks = adTaskCacheManager.getTasksOfDetector(detector.getDetectorId()); + List tasks = adTaskCacheManager.getTasksOfDetector(detector.getId()); assertEquals(2, tasks.size()); } @@ -223,8 +225,8 @@ public void testCancelByDetectorId() throws IOException { when(memoryTracker.canAllocateReserved(anyLong())).thenReturn(true); ADTask adTask = TestHelpers.randomAdTask(); adTaskCacheManager.add(adTask); - String detectorId = adTask.getDetectorId(); - String detectorTaskId = adTask.getDetectorId(); + String detectorId = adTask.getConfigId(); + String detectorTaskId = adTask.getConfigId(); String reason = randomAlphaOfLength(10); String userName = randomAlphaOfLength(5); ADTaskCancellationState state = adTaskCacheManager.cancelByDetectorId(detectorId, detectorTaskId, reason, userName); @@ -310,7 +312,7 @@ public void testPushBackEntity() throws IOException { public void testRealtimeTaskCache() { String detectorId1 = randomAlphaOfLength(10); - String newState = ADTaskState.INIT.name(); + String newState = TaskState.INIT.name(); Float newInitProgress = 0.0f; String newError = randomAlphaOfLength(5); assertTrue(adTaskCacheManager.isRealtimeTaskChangeNeeded(detectorId1, newState, newInitProgress, newError)); @@ -328,7 +330,7 @@ public void testRealtimeTaskCache() { adTaskCacheManager.updateRealtimeTaskCache(detectorId2, newState, newInitProgress, newError); assertEquals(2, adTaskCacheManager.getDetectorIdsInRealtimeTaskCache().length); - newState = ADTaskState.RUNNING.name(); + newState = TaskState.RUNNING.name(); newInitProgress = 1.0f; newError = "test error"; assertTrue(adTaskCacheManager.isRealtimeTaskChangeNeeded(detectorId1, newState, newInitProgress, newError)); @@ -349,12 +351,12 @@ public void testUpdateRealtimeTaskCache() { String detectorId = randomAlphaOfLength(5); adTaskCacheManager.initRealtimeTaskCache(detectorId, 
60_000); adTaskCacheManager.updateRealtimeTaskCache(detectorId, null, null, null); - ADRealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId); + RealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId); assertNull(realtimeTaskCache.getState()); assertNull(realtimeTaskCache.getError()); assertNull(realtimeTaskCache.getInitProgress()); - String state = ADTaskState.RUNNING.name(); + String state = TaskState.RUNNING.name(); Float initProgress = 0.1f; String error = randomAlphaOfLength(5); adTaskCacheManager.updateRealtimeTaskCache(detectorId, state, initProgress, error); @@ -363,7 +365,7 @@ public void testUpdateRealtimeTaskCache() { assertEquals(error, realtimeTaskCache.getError()); assertEquals(initProgress, realtimeTaskCache.getInitProgress()); - state = ADTaskState.STOPPED.name(); + state = TaskState.STOPPED.name(); adTaskCacheManager.updateRealtimeTaskCache(detectorId, state, initProgress, error); realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId); assertNull(realtimeTaskCache); @@ -379,10 +381,10 @@ public void testGetAndDecreaseEntityTaskLanes() throws IOException { public void testDeletedTask() { String taskId = randomAlphaOfLength(10); - adTaskCacheManager.addDeletedDetectorTask(taskId); - assertTrue(adTaskCacheManager.hasDeletedDetectorTask()); - assertEquals(taskId, adTaskCacheManager.pollDeletedDetectorTask()); - assertFalse(adTaskCacheManager.hasDeletedDetectorTask()); + adTaskCacheManager.addDeletedTask(taskId); + assertTrue(adTaskCacheManager.hasDeletedTask()); + assertEquals(taskId, adTaskCacheManager.pollDeletedTask()); + assertFalse(adTaskCacheManager.hasDeletedTask()); } public void testAcquireTaskUpdatingSemaphore() throws IOException, InterruptedException { @@ -430,11 +432,11 @@ private List addHCDetectorCache() throws IOException { true, ImmutableList.of(randomAlphaOfLength(5)) ); - String detectorId = detector.getDetectorId(); + String detectorId = detector.getId(); ADTask adDetectorTask = TestHelpers .randomAdTask( randomAlphaOfLength(5), - ADTaskState.CREATED, + TaskState.CREATED, Instant.now(), null, detectorId, @@ -444,7 +446,7 @@ private List addHCDetectorCache() throws IOException { ADTask adEntityTask = TestHelpers .randomAdTask( randomAlphaOfLength(5), - ADTaskState.CREATED, + TaskState.CREATED, Instant.now(), null, detectorId, @@ -527,7 +529,7 @@ public void testTaskLanes() throws IOException { public void testRefreshRealtimeJobRunTime() throws InterruptedException { String detectorId = randomAlphaOfLength(5); adTaskCacheManager.initRealtimeTaskCache(detectorId, 1_000); - ADRealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId); + RealtimeTaskCache realtimeTaskCache = adTaskCacheManager.getRealtimeTaskCache(detectorId); assertFalse(realtimeTaskCache.expired()); Thread.sleep(3_000); assertTrue(realtimeTaskCache.expired()); @@ -537,10 +539,10 @@ public void testRefreshRealtimeJobRunTime() throws InterruptedException { public void testAddDeletedDetector() { String detectorId = randomAlphaOfLength(5); - adTaskCacheManager.addDeletedDetector(detectorId); - String polledDetectorId = adTaskCacheManager.pollDeletedDetector(); + adTaskCacheManager.addDeletedConfig(detectorId); + String polledDetectorId = adTaskCacheManager.pollDeletedConfig(); assertEquals(detectorId, polledDetectorId); - assertNull(adTaskCacheManager.pollDeletedDetector()); + assertNull(adTaskCacheManager.pollDeletedConfig()); } public void testAddPendingEntitiesWithEmptyList() 
throws IOException { @@ -621,11 +623,11 @@ public void testADHCBatchTaskRunStateCacheWithCancel() { ADHCBatchTaskRunState state = adTaskCacheManager.getOrCreateHCDetectorTaskStateCache(detectorId, detectorTaskId); assertTrue(adTaskCacheManager.detectorTaskStateExists(detectorId, detectorTaskId)); - assertEquals(ADTaskState.INIT.name(), state.getDetectorTaskState()); + assertEquals(TaskState.INIT.name(), state.getDetectorTaskState()); assertFalse(state.expired()); - state.setDetectorTaskState(ADTaskState.RUNNING.name()); - assertEquals(ADTaskState.RUNNING.name(), adTaskCacheManager.getDetectorTaskState(detectorId, detectorTaskId)); + state.setDetectorTaskState(TaskState.RUNNING.name()); + assertEquals(TaskState.RUNNING.name(), adTaskCacheManager.getDetectorTaskState(detectorId, detectorTaskId)); String cancelReason = randomAlphaOfLength(5); String cancelledBy = randomAlphaOfLength(5); @@ -647,7 +649,7 @@ public void testADHCBatchTaskRunStateCacheWithCancel() { public void testUpdateDetectorTaskState() { String detectorId = randomAlphaOfLength(5); String detectorTaskId = randomAlphaOfLength(5); - String newState = ADTaskState.RUNNING.name(); + String newState = TaskState.RUNNING.name(); adTaskCacheManager.updateDetectorTaskState(detectorId, detectorTaskId, newState); assertEquals(newState, adTaskCacheManager.getDetectorTaskState(detectorId, detectorTaskId)); diff --git a/src/test/java/org/opensearch/ad/task/ADTaskManagerTests.java b/src/test/java/org/opensearch/ad/task/ADTaskManagerTests.java index 08645a279..f9df58903 100644 --- a/src/test/java/org/opensearch/ad/task/ADTaskManagerTests.java +++ b/src/test/java/org/opensearch/ad/task/ADTaskManagerTests.java @@ -28,25 +28,25 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.TestHelpers.randomAdTask; -import static org.opensearch.ad.TestHelpers.randomAnomalyDetector; -import static org.opensearch.ad.TestHelpers.randomDetectionDateRange; -import static org.opensearch.ad.TestHelpers.randomDetector; -import static org.opensearch.ad.TestHelpers.randomFeature; -import static org.opensearch.ad.TestHelpers.randomIntervalSchedule; -import static org.opensearch.ad.TestHelpers.randomIntervalTimeConfiguration; -import static org.opensearch.ad.TestHelpers.randomUser; -import static org.opensearch.ad.constant.CommonErrorMessages.CREATE_INDEX_NOT_ACKNOWLEDGED; -import static org.opensearch.ad.constant.CommonName.ANOMALY_RESULT_INDEX_ALIAS; -import static org.opensearch.ad.constant.CommonName.DETECTION_STATE_INDEX; -import static org.opensearch.ad.model.Entity.createSingleAttributeEntity; +import static org.opensearch.ad.constant.ADCommonName.ANOMALY_RESULT_INDEX_ALIAS; +import static org.opensearch.ad.constant.ADCommonName.DETECTION_STATE_INDEX; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_REQUEST_TIMEOUT; import static org.opensearch.ad.settings.AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.DELETE_AD_RESULT_WHEN_DELETE_DETECTOR; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_OLD_AD_TASK_DOCS_PER_DETECTOR; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.REQUEST_TIMEOUT; import static 
org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO; +import static org.opensearch.timeseries.TestHelpers.randomAdTask; +import static org.opensearch.timeseries.TestHelpers.randomAnomalyDetector; +import static org.opensearch.timeseries.TestHelpers.randomDetectionDateRange; +import static org.opensearch.timeseries.TestHelpers.randomDetector; +import static org.opensearch.timeseries.TestHelpers.randomFeature; +import static org.opensearch.timeseries.TestHelpers.randomIntervalSchedule; +import static org.opensearch.timeseries.TestHelpers.randomIntervalTimeConfiguration; +import static org.opensearch.timeseries.TestHelpers.randomUser; +import static org.opensearch.timeseries.constant.CommonMessages.CREATE_INDEX_NOT_ACKNOWLEDGED; +import static org.opensearch.timeseries.model.Entity.createSingleAttributeEntity; import java.io.IOException; import java.time.Instant; @@ -80,30 +80,21 @@ import org.opensearch.action.search.ShardSearchFailure; import org.opensearch.action.update.UpdateResponse; import org.opensearch.ad.ADUnitTestCase; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.common.exception.DuplicateTaskException; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskAction; import org.opensearch.ad.model.ADTaskProfile; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.rest.handler.AnomalyDetectorFunction; import org.opensearch.ad.rest.handler.IndexAnomalyDetectorJobActionHandler; import org.opensearch.ad.stats.InternalStatNames; import org.opensearch.ad.transport.ADStatsNodeResponse; import org.opensearch.ad.transport.ADStatsNodesResponse; import org.opensearch.ad.transport.ADTaskProfileNodeResponse; import org.opensearch.ad.transport.ADTaskProfileResponse; -import org.opensearch.ad.transport.AnomalyDetectorJobResponse; import org.opensearch.ad.transport.ForwardADTaskRequest; -import org.opensearch.ad.util.DiscoveryNodeFilterer; import org.opensearch.client.Client; import org.opensearch.cluster.ClusterName; import org.opensearch.cluster.node.DiscoveryNode; @@ -128,9 +119,18 @@ import org.opensearch.search.SearchHits; import org.opensearch.search.aggregations.InternalAggregations; import org.opensearch.search.internal.InternalSearchResponse; -import org.opensearch.telemetry.tracing.noop.NoopTracer; import org.opensearch.threadpool.ThreadPool; -import org.opensearch.transport.Transport; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.DuplicateTaskException; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.task.RealtimeTaskCache; +import org.opensearch.timeseries.transport.JobResponse; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; import org.opensearch.transport.TransportResponseHandler; import org.opensearch.transport.TransportService; 
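Illustrative aside (not from the patch itself): the ADTaskManagerTests hunks below retype their job-completion callbacks from ActionListener<AnomalyDetectorJobResponse> to ActionListener<JobResponse>, using the org.opensearch.timeseries.transport.JobResponse import added above. A minimal self-contained sketch of that listener shape follows; the class and helper names (JobListenerSketch, noOpJobListener) are assumed for illustration only.

import org.opensearch.core.action.ActionListener;
import org.opensearch.timeseries.transport.JobResponse;

// Sketch only: the ActionListener<JobResponse> shape that setUp() wraps in a
// Mockito spy below. The callbacks stay empty because the tests assert on the
// spy (e.g. verify(listener, times(1)).onResponse(any())) rather than on any
// behavior implemented here.
class JobListenerSketch {
    static ActionListener<JobResponse> noOpJobListener() { // assumed helper name
        return new ActionListener<JobResponse>() {
            @Override
            public void onResponse(JobResponse response) {
                // intentionally empty
            }

            @Override
            public void onFailure(Exception e) {
                // intentionally empty
            }
        };
    }
}

Spying on such a no-op listener keeps success/failure assertions in the test bodies, which is why this migration only needs to change the listener's type parameter.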
@@ -146,7 +146,7 @@ public class ADTaskManagerTests extends ADUnitTestCase {
     private ClusterService clusterService;
     private ClusterSettings clusterSettings;
     private DiscoveryNodeFilterer nodeFilter;
-    private AnomalyDetectionIndices detectionIndices;
+    private ADIndexManagement detectionIndices;
     private ADTaskCacheManager adTaskCacheManager;
     private HashRing hashRing;
     private ThreadContext.StoredContext context;
@@ -156,8 +156,8 @@ public class ADTaskManagerTests extends ADUnitTestCase {
     private ThreadPool threadPool;
     private IndexAnomalyDetectorJobActionHandler indexAnomalyDetectorJobActionHandler;
-    private DetectionDateRange detectionDateRange;
-    private ActionListener<AnomalyDetectorJobResponse> listener;
+    private DateRange detectionDateRange;
+    private ActionListener<JobResponse> listener;
     private DiscoveryNode node1;
     private DiscoveryNode node2;
@@ -200,7 +200,7 @@ public class ADTaskManagerTests extends ADUnitTestCase {
         + ",\"parent_task_id\":\"a1civ3sBwF58XZxvKrko\",\"worker_node\":\"DL5uOJV3TjOOAyh5hJXrCA\",\"current_piece\""
         + ":1630999260000,\"execution_end_time\":1630999442814}}";

     @Captor
-    ArgumentCaptor<TransportResponseHandler<AnomalyDetectorJobResponse>> remoteResponseHandler;
+    ArgumentCaptor<TransportResponseHandler<JobResponse>> remoteResponseHandler;

     @Override
     public void setUp() throws Exception {
@@ -208,20 +208,20 @@ public void setUp() throws Exception {
         Instant now = Instant.now();
         Instant startTime = now.minus(10, ChronoUnit.DAYS);
         Instant endTime = now.minus(1, ChronoUnit.DAYS);
-        detectionDateRange = new DetectionDateRange(startTime, endTime);
+        detectionDateRange = new DateRange(startTime, endTime);

         settings = Settings
             .builder()
             .put(MAX_OLD_AD_TASK_DOCS_PER_DETECTOR.getKey(), 2)
             .put(BATCH_TASK_PIECE_INTERVAL_SECONDS.getKey(), 1)
-            .put(REQUEST_TIMEOUT.getKey(), TimeValue.timeValueSeconds(10))
+            .put(AD_REQUEST_TIMEOUT.getKey(), TimeValue.timeValueSeconds(10))
             .build();

         clusterSettings = clusterSetting(
             settings,
             MAX_OLD_AD_TASK_DOCS_PER_DETECTOR,
             BATCH_TASK_PIECE_INTERVAL_SECONDS,
-            REQUEST_TIMEOUT,
+            AD_REQUEST_TIMEOUT,
             DELETE_AD_RESULT_WHEN_DELETE_DETECTOR,
             MAX_BATCH_TASK_PER_NODE,
             MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS
@@ -232,19 +232,10 @@ public void setUp() throws Exception {
         client = mock(Client.class);
         nodeFilter = mock(DiscoveryNodeFilterer.class);
-        detectionIndices = mock(AnomalyDetectionIndices.class);
+        detectionIndices = mock(ADIndexManagement.class);
         adTaskCacheManager = mock(ADTaskCacheManager.class);
         hashRing = mock(HashRing.class);
-        transportService = new TransportService(
-            Settings.EMPTY,
-            mock(Transport.class),
-            null,
-            TransportService.NOOP_TRANSPORT_INTERCEPTOR,
-            x -> null,
-            null,
-            Collections.emptySet(),
-            NoopTracer.INSTANCE
-        );
+        transportService = mock(TransportService.class);
         threadPool = mock(ThreadPool.class);
         threadContext = new ThreadContext(settings);
         when(threadPool.getThreadContext()).thenReturn(threadContext);
@@ -264,9 +255,9 @@ public void setUp() throws Exception {
             )
         );

-        listener = spy(new ActionListener<AnomalyDetectorJobResponse>() {
+        listener = spy(new ActionListener<JobResponse>() {
             @Override
-            public void onResponse(AnomalyDetectorJobResponse bulkItemResponses) {}
+            public void onResponse(JobResponse bulkItemResponses) {}

             @Override
             public void onFailure(Exception e) {}
@@ -301,8 +292,8 @@ private void setupGetDetector(AnomalyDetector detector) {
             .onResponse(
                 new GetResponse(
                     new GetResult(
-                        AnomalyDetector.ANOMALY_DETECTORS_INDEX,
-                        detector.getDetectorId(),
+                        CommonName.CONFIG_INDEX,
+                        detector.getId(),
                         UNASSIGNED_SEQ_NO,
                         0,
                         -1,
@@ -338,8 +329,8 @@ public void testCreateTaskIndexNotAcknowledged() throws IOException {
            ActionListener<CreateIndexResponse> listener =
invocation.getArgument(0); listener.onResponse(new CreateIndexResponse(false, false, ANOMALY_RESULT_INDEX_ALIAS)); return null; - }).when(detectionIndices).initDetectionStateIndex(any()); - doReturn(false).when(detectionIndices).doesDetectorStateIndexExist(); + }).when(detectionIndices).initStateIndex(any()); + doReturn(false).when(detectionIndices).doesStateIndexExist(); AnomalyDetector detector = randomDetector(ImmutableList.of(randomFeature(true)), randomAlphaOfLength(5), 1, randomAlphaOfLength(5)); setupGetDetector(detector); @@ -354,8 +345,8 @@ public void testCreateTaskIndexWithResourceAlreadyExistsException() throws IOExc ActionListener listener = invocation.getArgument(0); listener.onFailure(new ResourceAlreadyExistsException("index created")); return null; - }).when(detectionIndices).initDetectionStateIndex(any()); - doReturn(false).when(detectionIndices).doesDetectorStateIndexExist(); + }).when(detectionIndices).initStateIndex(any()); + doReturn(false).when(detectionIndices).doesStateIndexExist(); AnomalyDetector detector = randomDetector(ImmutableList.of(randomFeature(true)), randomAlphaOfLength(5), 1, randomAlphaOfLength(5)); setupGetDetector(detector); @@ -369,8 +360,8 @@ public void testCreateTaskIndexWithException() throws IOException { ActionListener listener = invocation.getArgument(0); listener.onFailure(new RuntimeException(error)); return null; - }).when(detectionIndices).initDetectionStateIndex(any()); - doReturn(false).when(detectionIndices).doesDetectorStateIndexExist(); + }).when(detectionIndices).initStateIndex(any()); + doReturn(false).when(detectionIndices).doesStateIndexExist(); AnomalyDetector detector = randomDetector(ImmutableList.of(randomFeature(true)), randomAlphaOfLength(5), 1, randomAlphaOfLength(5)); setupGetDetector(detector); @@ -390,7 +381,7 @@ public void testStartDetectorWithNoEnabledFeature() throws IOException { adTaskManager .startDetector( - detector.getDetectorId(), + detector.getId(), detectionDateRange, indexAnomalyDetectorJobActionHandler, randomUser(), @@ -409,7 +400,7 @@ public void testStartDetectorForHistoricalAnalysis() throws IOException { adTaskManager .startDetector( - detector.getDetectorId(), + detector.getId(), detectionDateRange, indexAnomalyDetectorJobActionHandler, randomUser(), @@ -460,7 +451,7 @@ private void setupTaskSlots(int node1UsedTaskSlots, int node1AssignedTaskSLots, public void testCheckTaskSlotsWithNoAvailableTaskSlots() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of(randomAlphaOfLength(5))) @@ -485,7 +476,7 @@ private void setupSearchTopEntities(int entitySize) { public void testCheckTaskSlotsWithAvailableTaskSlotsForHC() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of(randomAlphaOfLength(5))) @@ -504,7 +495,7 @@ public void testCheckTaskSlotsWithAvailableTaskSlotsForHC() throws IOException { public void testCheckTaskSlotsWithAvailableTaskSlotsForSingleEntityDetector() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of()) @@ -522,7 
+513,7 @@ public void testCheckTaskSlotsWithAvailableTaskSlotsForSingleEntityDetector() th public void testCheckTaskSlotsWithAvailableTaskSlotsAndNoEntity() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of(randomAlphaOfLength(5))) @@ -540,7 +531,7 @@ public void testCheckTaskSlotsWithAvailableTaskSlotsAndNoEntity() throws IOExcep public void testCheckTaskSlotsWithAvailableTaskSlotsForScale() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of(randomAlphaOfLength(5))) @@ -572,7 +563,7 @@ public void testDeleteDuplicateTasks() throws IOException { public void testParseEntityForSingleCategoryHC() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers.randomAnomalyDetectorUsingCategoryFields(randomAlphaOfLength(5), ImmutableList.of(randomAlphaOfLength(5))) @@ -585,7 +576,7 @@ public void testParseEntityForSingleCategoryHC() throws IOException { public void testParseEntityForMultiCategoryHC() throws IOException { ADTask adTask = randomAdTask( randomAlphaOfLength(5), - ADTaskState.INIT, + TaskState.INIT, Instant.now(), randomAlphaOfLength(5), TestHelpers @@ -647,7 +638,7 @@ public void testGetADTaskWithNotExistTask() { ActionListener listener = invocation.getArgument(1); GetResponse response = new GetResponse( new GetResult( - AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, + CommonName.JOB_INDEX, taskId, UNASSIGNED_SEQ_NO, 0, @@ -703,7 +694,7 @@ public void testGetADTaskWithExistingTask() { ADTask adTask = randomAdTask(); GetResponse response = new GetResponse( new GetResult( - AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, + CommonName.JOB_INDEX, taskId, UNASSIGNED_SEQ_NO, 0, @@ -725,7 +716,7 @@ public void testGetADTaskWithExistingTask() { @SuppressWarnings("unchecked") public void testUpdateLatestRealtimeTaskOnCoordinatingNode() { String detectorId = randomAlphaOfLength(5); - String state = ADTaskState.RUNNING.name(); + String state = TaskState.RUNNING.name(); Long rcfTotalUpdates = randomLongBetween(200, 1000); Long detectorIntervalInMinutes = 1L; String error = randomAlphaOfLength(5); @@ -784,7 +775,7 @@ public void testGetLocalADTaskProfilesByDetectorId() { @SuppressWarnings("unchecked") public void testRemoveStaleRunningEntity() throws IOException { - ActionListener actionListener = mock(ActionListener.class); + ActionListener actionListener = mock(ActionListener.class); ADTask adTask = randomAdTask(); String entity = randomAlphaOfLength(5); ExecutorService executeService = mock(ExecutorService.class); @@ -833,14 +824,14 @@ public void testResetLatestFlagAsFalse() throws IOException { } public void testCleanADResultOfDeletedDetectorWithNoDeletedDetector() { - when(adTaskCacheManager.pollDeletedDetector()).thenReturn(null); + when(adTaskCacheManager.pollDeletedConfig()).thenReturn(null); adTaskManager.cleanADResultOfDeletedDetector(); verify(client, never()).execute(eq(DeleteByQueryAction.INSTANCE), any(), any()); } public void testCleanADResultOfDeletedDetectorWithException() { String detectorId = randomAlphaOfLength(5); - when(adTaskCacheManager.pollDeletedDetector()).thenReturn(detectorId); + 
when(adTaskCacheManager.pollDeletedConfig()).thenReturn(detectorId); doAnswer(invocation -> { ActionListener listener = invocation.getArgument(2); @@ -857,7 +848,7 @@ public void testCleanADResultOfDeletedDetectorWithException() { .builder() .put(MAX_OLD_AD_TASK_DOCS_PER_DETECTOR.getKey(), 2) .put(BATCH_TASK_PIECE_INTERVAL_SECONDS.getKey(), 1) - .put(REQUEST_TIMEOUT.getKey(), TimeValue.timeValueSeconds(10)) + .put(AD_REQUEST_TIMEOUT.getKey(), TimeValue.timeValueSeconds(10)) .put(DELETE_AD_RESULT_WHEN_DELETE_DETECTOR.getKey(), true) .build(); @@ -865,7 +856,7 @@ public void testCleanADResultOfDeletedDetectorWithException() { settings, MAX_OLD_AD_TASK_DOCS_PER_DETECTOR, BATCH_TASK_PIECE_INTERVAL_SECONDS, - REQUEST_TIMEOUT, + AD_REQUEST_TIMEOUT, DELETE_AD_RESULT_WHEN_DELETE_DETECTOR, MAX_BATCH_TASK_PER_NODE, MAX_RUNNING_ENTITIES_PER_DETECTOR_FOR_HISTORICAL_ANALYSIS @@ -888,11 +879,11 @@ public void testCleanADResultOfDeletedDetectorWithException() { ); adTaskManager.cleanADResultOfDeletedDetector(); verify(client, times(1)).execute(eq(DeleteByQueryAction.INSTANCE), any(), any()); - verify(adTaskCacheManager, times(1)).addDeletedDetector(eq(detectorId)); + verify(adTaskCacheManager, times(1)).addDeletedConfig(eq(detectorId)); adTaskManager.cleanADResultOfDeletedDetector(); verify(client, times(2)).execute(eq(DeleteByQueryAction.INSTANCE), any(), any()); - verify(adTaskCacheManager, times(1)).addDeletedDetector(eq(detectorId)); + verify(adTaskCacheManager, times(1)).addDeletedConfig(eq(detectorId)); } public void testMaintainRunningHistoricalTasksWithOwningNodeIsNotLocalNode() { @@ -997,11 +988,11 @@ public void testMaintainRunningRealtimeTasks() { when(adTaskCacheManager.getDetectorIdsInRealtimeTaskCache()).thenReturn(new String[] { detectorId1, detectorId2, detectorId3 }); when(adTaskCacheManager.getRealtimeTaskCache(detectorId1)).thenReturn(null); - ADRealtimeTaskCache cacheOfDetector2 = mock(ADRealtimeTaskCache.class); + RealtimeTaskCache cacheOfDetector2 = mock(RealtimeTaskCache.class); when(cacheOfDetector2.expired()).thenReturn(false); when(adTaskCacheManager.getRealtimeTaskCache(detectorId2)).thenReturn(cacheOfDetector2); - ADRealtimeTaskCache cacheOfDetector3 = mock(ADRealtimeTaskCache.class); + RealtimeTaskCache cacheOfDetector3 = mock(RealtimeTaskCache.class); when(cacheOfDetector3.expired()).thenReturn(true); when(adTaskCacheManager.getRealtimeTaskCache(detectorId3)).thenReturn(cacheOfDetector3); @@ -1012,10 +1003,10 @@ public void testMaintainRunningRealtimeTasks() { @SuppressWarnings("unchecked") public void testStartHistoricalAnalysisWithNoOwningNode() throws IOException { AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableList.of()); - DetectionDateRange detectionDateRange = TestHelpers.randomDetectionDateRange(); + DateRange detectionDateRange = TestHelpers.randomDetectionDateRange(); User user = null; int availableTaskSlots = randomIntBetween(1, 10); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer> function = invocation.getArgument(1); function.accept(Optional.empty()); @@ -1041,10 +1032,10 @@ public void testGetAndExecuteOnLatestADTasksWithRunningRealtimeTaskWithTaskStopp .builder() .taskId(randomAlphaOfLength(5)) .taskType(ADTaskType.HISTORICAL_HC_DETECTOR.name()) - .detectorId(randomAlphaOfLength(5)) + .configId(randomAlphaOfLength(5)) .detector(detector) .entity(null) - .state(ADTaskState.RUNNING.name()) + .state(TaskState.RUNNING.name()) .taskProgress(0.5f) 
.initProgress(1.0f) .currentPiece(Instant.now().truncatedTo(ChronoUnit.SECONDS).minus(randomIntBetween(1, 100), ChronoUnit.MINUTES)) @@ -1107,10 +1098,10 @@ public void testGetAndExecuteOnLatestADTasksWithRunningHistoricalTask() throws I .builder() .taskId(historicalTaskId) .taskType(ADTaskType.HISTORICAL_HC_DETECTOR.name()) - .detectorId(randomAlphaOfLength(5)) + .configId(randomAlphaOfLength(5)) .detector(detector) .entity(null) - .state(ADTaskState.RUNNING.name()) + .state(TaskState.RUNNING.name()) .taskProgress(0.5f) .initProgress(1.0f) .currentPiece(Instant.now().truncatedTo(ChronoUnit.SECONDS).minus(randomIntBetween(1, 100), ChronoUnit.MINUTES)) @@ -1196,7 +1187,7 @@ private void setupGetAndExecuteOnLatestADTasks(ADTaskProfile adTaskProfile) { }).when(client).search(any(), any()); String detectorId = randomAlphaOfLength(5); Consumer> function = mock(Consumer.class); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer getNodeFunction = invocation.getArgument(0); @@ -1239,7 +1230,7 @@ private void setupGetAndExecuteOnLatestADTasks(ADTaskProfile adTaskProfile) { ActionListener getResponselistener = invocation.getArgument(1); GetResponse response = new GetResponse( new GetResult( - AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, + CommonName.JOB_INDEX, detectorId, UNASSIGNED_SEQ_NO, 0, @@ -1247,7 +1238,7 @@ private void setupGetAndExecuteOnLatestADTasks(ADTaskProfile adTaskProfile) { true, BytesReference .bytes( - new AnomalyDetectorJob( + new Job( detectorId, randomIntervalSchedule(), randomIntervalTimeConfiguration(), @@ -1289,14 +1280,14 @@ public void testCreateADTaskDirectlyWithException() throws IOException { } public void testCleanChildTasksAndADResultsOfDeletedTaskWithNoDeletedDetectorTask() { - when(adTaskCacheManager.hasDeletedDetectorTask()).thenReturn(false); + when(adTaskCacheManager.hasDeletedTask()).thenReturn(false); adTaskManager.cleanChildTasksAndADResultsOfDeletedTask(); verify(client, never()).execute(any(), any(), any()); } public void testCleanChildTasksAndADResultsOfDeletedTaskWithNullTask() { - when(adTaskCacheManager.hasDeletedDetectorTask()).thenReturn(true); - when(adTaskCacheManager.pollDeletedDetectorTask()).thenReturn(null); + when(adTaskCacheManager.hasDeletedTask()).thenReturn(true); + when(adTaskCacheManager.pollDeletedTask()).thenReturn(null); doAnswer(invocation -> { ActionListener actionListener = invocation.getArgument(2); actionListener.onFailure(new RuntimeException("test")); @@ -1314,8 +1305,8 @@ public void testCleanChildTasksAndADResultsOfDeletedTaskWithNullTask() { } public void testCleanChildTasksAndADResultsOfDeletedTaskWithFailToDeleteADResult() { - when(adTaskCacheManager.hasDeletedDetectorTask()).thenReturn(true); - when(adTaskCacheManager.pollDeletedDetectorTask()).thenReturn(randomAlphaOfLength(5)); + when(adTaskCacheManager.hasDeletedTask()).thenReturn(true); + when(adTaskCacheManager.pollDeletedTask()).thenReturn(randomAlphaOfLength(5)); doAnswer(invocation -> { ActionListener actionListener = invocation.getArgument(2); actionListener.onFailure(new RuntimeException("test")); @@ -1333,8 +1324,8 @@ public void testCleanChildTasksAndADResultsOfDeletedTaskWithFailToDeleteADResult } public void testCleanChildTasksAndADResultsOfDeletedTask() { - when(adTaskCacheManager.hasDeletedDetectorTask()).thenReturn(true); - when(adTaskCacheManager.pollDeletedDetectorTask()).thenReturn(randomAlphaOfLength(5)).thenReturn(null); + 
when(adTaskCacheManager.hasDeletedTask()).thenReturn(true); + when(adTaskCacheManager.pollDeletedTask()).thenReturn(randomAlphaOfLength(5)).thenReturn(null); doAnswer(invocation -> { ActionListener actionListener = invocation.getArgument(2); BulkByScrollResponse response = mock(BulkByScrollResponse.class); @@ -1363,7 +1354,7 @@ public void testDeleteADTasks() { }).when(client).execute(any(), any(), any()); String detectorId = randomAlphaOfLength(5); - AnomalyDetectorFunction function = mock(AnomalyDetectorFunction.class); + ExecutorFunction function = mock(ExecutorFunction.class); ActionListener listener = mock(ActionListener.class); adTaskManager.deleteADTasks(detectorId, function, listener); verify(function, times(1)).execute(); @@ -1388,7 +1379,7 @@ public void testDeleteADTasksWithBulkFailures() { }).when(client).execute(any(), any(), any()); String detectorId = randomAlphaOfLength(5); - AnomalyDetectorFunction function = mock(AnomalyDetectorFunction.class); + ExecutorFunction function = mock(ExecutorFunction.class); ActionListener listener = mock(ActionListener.class); adTaskManager.deleteADTasks(detectorId, function, listener); verify(listener, times(1)).onFailure(any()); @@ -1407,7 +1398,7 @@ public void testDeleteADTasksWithException() { }).when(client).execute(any(), any(), any()); String detectorId = randomAlphaOfLength(5); - AnomalyDetectorFunction function = mock(AnomalyDetectorFunction.class); + ExecutorFunction function = mock(ExecutorFunction.class); ActionListener listener = mock(ActionListener.class); adTaskManager.deleteADTasks(detectorId, function, listener); @@ -1422,7 +1413,7 @@ public void testDeleteADTasksWithException() { @SuppressWarnings("unchecked") public void testScaleUpTaskSlots() throws IOException { ADTask adTask = randomAdTask(ADTaskType.HISTORICAL_HC_ENTITY); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); when(adTaskCacheManager.getAvailableNewEntityTaskLanes(anyString())).thenReturn(0); doReturn(2).when(adTaskManager).detectorTaskSlotScaleDelta(anyString()); when(adTaskCacheManager.getLastScaleEntityTaskLaneTime(anyString())).thenReturn(null); @@ -1442,7 +1433,7 @@ public void testScaleUpTaskSlots() throws IOException { public void testForwardRequestToLeadNodeWithNotExistingNode() throws IOException { ADTask adTask = randomAdTask(ADTaskType.HISTORICAL_HC_ENTITY); ForwardADTaskRequest forwardADTaskRequest = new ForwardADTaskRequest(adTask, ADTaskAction.APPLY_FOR_TASK_SLOTS); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer> function = invocation.getArgument(1); function.accept(Optional.empty()); @@ -1458,18 +1449,18 @@ public void testScaleTaskLaneOnCoordinatingNode() { ADTask adTask = mock(ADTask.class); when(adTask.getCoordinatingNode()).thenReturn(node1.getId()); when(nodeFilter.getEligibleDataNodes()).thenReturn(new DiscoveryNode[] { node1, node2 }); - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); adTaskManager.scaleTaskLaneOnCoordinatingNode(adTask, 2, transportService, listener); } @SuppressWarnings("unchecked") public void testStartDetectorWithException() throws IOException { AnomalyDetector detector = randomAnomalyDetector(ImmutableList.of(randomFeature(true))); - DetectionDateRange detectionDateRange = randomDetectionDateRange(); + DateRange detectionDateRange = randomDetectionDateRange(); User user = null; - ActionListener 
listener = mock(ActionListener.class); - when(detectionIndices.doesDetectorStateIndexExist()).thenReturn(false); - doThrow(new RuntimeException("test")).when(detectionIndices).initDetectionStateIndex(any()); + ActionListener listener = mock(ActionListener.class); + when(detectionIndices.doesStateIndexExist()).thenReturn(false); + doThrow(new RuntimeException("test")).when(detectionIndices).initStateIndex(any()); adTaskManager.startDetector(detector, detectionDateRange, user, transportService, listener); verify(listener, times(1)).onFailure(any()); } @@ -1478,7 +1469,7 @@ public void testStartDetectorWithException() throws IOException { public void testStopDetectorWithNonExistingDetector() { String detectorId = randomAlphaOfLength(5); boolean historical = true; - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer> function = invocation.getArgument(1); function.accept(Optional.empty()); @@ -1492,7 +1483,7 @@ public void testStopDetectorWithNonExistingDetector() { public void testStopDetectorWithNonExistingTask() { String detectorId = randomAlphaOfLength(5); boolean historical = true; - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer> function = invocation.getArgument(1); AnomalyDetector detector = randomAnomalyDetector(ImmutableList.of(randomFeature(true))); @@ -1514,7 +1505,7 @@ public void testStopDetectorWithNonExistingTask() { public void testStopDetectorWithTaskDone() { String detectorId = randomAlphaOfLength(5); boolean historical = true; - ActionListener listener = mock(ActionListener.class); + ActionListener listener = mock(ActionListener.class); doAnswer(invocation -> { Consumer> function = invocation.getArgument(1); AnomalyDetector detector = randomAnomalyDetector(ImmutableList.of(randomFeature(true))); @@ -1562,7 +1553,7 @@ public void testGetDetectorWithWrongContent() { ActionListener responseListener = invocation.getArgument(1); GetResponse response = new GetResponse( new GetResult( - AnomalyDetector.ANOMALY_DETECTORS_INDEX, + CommonName.CONFIG_INDEX, detectorId, UNASSIGNED_SEQ_NO, 0, @@ -1628,10 +1619,10 @@ public void testDeleteTaskDocs() { String detectorId = randomAlphaOfLength(5); SearchRequest searchRequest = mock(SearchRequest.class); - AnomalyDetectorFunction function = mock(AnomalyDetectorFunction.class); + ExecutorFunction function = mock(ExecutorFunction.class); ActionListener listener = mock(ActionListener.class); adTaskManager.deleteTaskDocs(detectorId, searchRequest, function, listener); - verify(adTaskCacheManager, times(1)).addDeletedDetectorTask(anyString()); + verify(adTaskCacheManager, times(1)).addDeletedTask(anyString()); verify(function, times(1)).execute(); } } diff --git a/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultRequestTests.java b/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultRequestTests.java index 0ee1efd73..dd200f8f4 100644 --- a/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultRequestTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultRequestTests.java @@ -14,9 +14,9 @@ import java.io.IOException; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.model.ADTask; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; public class ADBatchAnomalyResultRequestTests extends 
OpenSearchTestCase { diff --git a/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultTransportActionTests.java index e8ef1f945..8ce30df12 100644 --- a/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADBatchAnomalyResultTransportActionTests.java @@ -11,10 +11,10 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; +import static org.opensearch.ad.settings.ADEnabledSetting.AD_ENABLED; import static org.opensearch.ad.settings.AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; -import static org.opensearch.ad.settings.EnabledSetting.AD_PLUGIN_ENABLED; +import static org.opensearch.timeseries.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; import java.io.IOException; import java.time.Instant; @@ -25,16 +25,16 @@ import org.opensearch.action.ActionRequestValidationException; import org.opensearch.action.get.GetResponse; import org.opensearch.ad.HistoricalAnalysisIntegTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.util.ExceptionUtil; import org.opensearch.common.settings.Settings; import org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper; import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.util.ExceptionUtil; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -47,7 +47,7 @@ public class ADBatchAnomalyResultTransportActionTests extends HistoricalAnalysis private Instant endTime; private String type = "error"; private int detectionIntervalInMinutes = 1; - private DetectionDateRange dateRange; + private DateRange dateRange; @Override @Before @@ -56,7 +56,7 @@ public void setUp() throws Exception { testIndex = "test_historical_data"; startTime = Instant.now().minus(10, ChronoUnit.DAYS); endTime = Instant.now(); - dateRange = new DetectionDateRange(endTime, endTime.plus(10, ChronoUnit.DAYS)); + dateRange = new DateRange(endTime, endTime.plus(10, ChronoUnit.DAYS)); ingestTestData(testIndex, startTime, detectionIntervalInMinutes, type); createDetectionStateIndex(); } @@ -82,46 +82,43 @@ public void testAnomalyDetectorWithNullDetector() { } public void testHistoricalAnalysisWithFutureDateRange() throws IOException, InterruptedException { - DetectionDateRange dateRange = new DetectionDateRange(endTime, endTime.plus(10, ChronoUnit.DAYS)); + DateRange dateRange = new DateRange(endTime, endTime.plus(10, ChronoUnit.DAYS)); testInvalidDetectionDateRange(dateRange); } public void testHistoricalAnalysisWithInvalidHistoricalDateRange() throws IOException, InterruptedException { - DetectionDateRange dateRange = new DetectionDateRange(startTime.minus(10, ChronoUnit.DAYS), startTime); + DateRange dateRange = new DateRange(startTime.minus(10, ChronoUnit.DAYS), startTime); 
testInvalidDetectionDateRange(dateRange); } public void testHistoricalAnalysisWithSmallHistoricalDateRange() throws IOException, InterruptedException { - DetectionDateRange dateRange = new DetectionDateRange(startTime, startTime.plus(10, ChronoUnit.MINUTES)); + DateRange dateRange = new DateRange(startTime, startTime.plus(10, ChronoUnit.MINUTES)); testInvalidDetectionDateRange(dateRange, "There is not enough data to train model"); } public void testHistoricalAnalysisWithValidDateRange() throws IOException, InterruptedException { - DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, endTime); ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(dateRange); client().execute(ADBatchAnomalyResultAction.INSTANCE, request).actionGet(5000); Thread.sleep(20000); - GetResponse doc = getDoc(CommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); + GetResponse doc = getDoc(ADCommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); assertTrue(HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS.contains(doc.getSourceAsMap().get(ADTask.STATE_FIELD))); } public void testHistoricalAnalysisWithNonExistingIndex() throws IOException { - ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest( - new DetectionDateRange(startTime, endTime), - randomAlphaOfLength(5) - ); + ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(new DateRange(startTime, endTime), randomAlphaOfLength(5)); client().execute(ADBatchAnomalyResultAction.INSTANCE, request).actionGet(10_000); } public void testHistoricalAnalysisExceedsMaxRunningTaskLimit() throws IOException, InterruptedException { updateTransientSettings(ImmutableMap.of(MAX_BATCH_TASK_PER_NODE.getKey(), 1)); updateTransientSettings(ImmutableMap.of(BATCH_TASK_PIECE_INTERVAL_SECONDS.getKey(), 5)); - DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, endTime); int totalDataNodes = getDataNodes().size(); for (int i = 0; i < totalDataNodes; i++) { client().execute(ADBatchAnomalyResultAction.INSTANCE, adBatchAnomalyResultRequest(dateRange)).actionGet(5000); } - waitUntil(() -> countDocs(CommonName.DETECTION_STATE_INDEX) >= totalDataNodes, 10, TimeUnit.SECONDS); + waitUntil(() -> countDocs(ADCommonName.DETECTION_STATE_INDEX) >= totalDataNodes, 10, TimeUnit.SECONDS); ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(dateRange); try { @@ -137,43 +134,43 @@ public void testHistoricalAnalysisExceedsMaxRunningTaskLimit() throws IOExceptio public void testDisableADPlugin() throws IOException { try { - updateTransientSettings(ImmutableMap.of(AD_PLUGIN_ENABLED, false)); - ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(new DetectionDateRange(startTime, endTime)); + updateTransientSettings(ImmutableMap.of(AD_ENABLED, false)); + ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(new DateRange(startTime, endTime)); RuntimeException exception = expectThrowsAnyOf( ImmutableList.of(NotSerializableExceptionWrapper.class, EndRunException.class), () -> client().execute(ADBatchAnomalyResultAction.INSTANCE, request).actionGet(10000) ); - assertTrue(exception.getMessage(), exception.getMessage().contains("AD plugin is disabled")); - updateTransientSettings(ImmutableMap.of(AD_PLUGIN_ENABLED, false)); + assertTrue(exception.getMessage(), exception.getMessage().contains("AD functionality is disabled")); + updateTransientSettings(ImmutableMap.of(AD_ENABLED, 
false)); } finally { // guarantee reset back to default - updateTransientSettings(ImmutableMap.of(AD_PLUGIN_ENABLED, true)); + updateTransientSettings(ImmutableMap.of(AD_ENABLED, true)); } } public void testMultipleTasks() throws IOException, InterruptedException { updateTransientSettings(ImmutableMap.of(MAX_BATCH_TASK_PER_NODE.getKey(), 2)); - DetectionDateRange dateRange = new DetectionDateRange(startTime, endTime); + DateRange dateRange = new DateRange(startTime, endTime); for (int i = 0; i < getDataNodes().size(); i++) { client().execute(ADBatchAnomalyResultAction.INSTANCE, adBatchAnomalyResultRequest(dateRange)); } ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest( - new DetectionDateRange(startTime, startTime.plus(2000, ChronoUnit.MINUTES)) + new DateRange(startTime, startTime.plus(2000, ChronoUnit.MINUTES)) ); client().execute(ADBatchAnomalyResultAction.INSTANCE, request).actionGet(5000); Thread.sleep(25000); - GetResponse doc = getDoc(CommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); + GetResponse doc = getDoc(ADCommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); assertTrue(HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS.contains(doc.getSourceAsMap().get(ADTask.STATE_FIELD))); updateTransientSettings(ImmutableMap.of(MAX_BATCH_TASK_PER_NODE.getKey(), 1)); } - private ADBatchAnomalyResultRequest adBatchAnomalyResultRequest(DetectionDateRange dateRange) throws IOException { + private ADBatchAnomalyResultRequest adBatchAnomalyResultRequest(DateRange dateRange) throws IOException { return adBatchAnomalyResultRequest(dateRange, testIndex); } - private ADBatchAnomalyResultRequest adBatchAnomalyResultRequest(DetectionDateRange dateRange, String indexName) throws IOException { + private ADBatchAnomalyResultRequest adBatchAnomalyResultRequest(DateRange dateRange, String indexName) throws IOException { AnomalyDetector detector = TestHelpers .randomDetector(ImmutableList.of(maxValueFeature()), indexName, detectionIntervalInMinutes, timeField); ADTask adTask = randomCreatedADTask(randomAlphaOfLength(5), detector, dateRange); @@ -181,15 +178,15 @@ private ADBatchAnomalyResultRequest adBatchAnomalyResultRequest(DetectionDateRan return new ADBatchAnomalyResultRequest(adTask); } - private void testInvalidDetectionDateRange(DetectionDateRange dateRange) throws IOException, InterruptedException { + private void testInvalidDetectionDateRange(DateRange dateRange) throws IOException, InterruptedException { testInvalidDetectionDateRange(dateRange, "There is no data in the detection date range"); } - private void testInvalidDetectionDateRange(DetectionDateRange dateRange, String error) throws IOException, InterruptedException { + private void testInvalidDetectionDateRange(DateRange dateRange, String error) throws IOException, InterruptedException { ADBatchAnomalyResultRequest request = adBatchAnomalyResultRequest(dateRange); client().execute(ADBatchAnomalyResultAction.INSTANCE, request).actionGet(5000); Thread.sleep(5000); - GetResponse doc = getDoc(CommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); + GetResponse doc = getDoc(ADCommonName.DETECTION_STATE_INDEX, request.getAdTask().getTaskId()); assertEquals(error, doc.getSourceAsMap().get(ADTask.ERROR_FIELD)); } } diff --git a/src/test/java/org/opensearch/ad/transport/ADCancelTaskNodeRequestTests.java b/src/test/java/org/opensearch/ad/transport/ADCancelTaskNodeRequestTests.java index 5efcd271d..da3c2a7db 100644 --- a/src/test/java/org/opensearch/ad/transport/ADCancelTaskNodeRequestTests.java +++ 
b/src/test/java/org/opensearch/ad/transport/ADCancelTaskNodeRequestTests.java @@ -28,7 +28,7 @@ public void testParseOldADCancelTaskNodeRequestTest() throws IOException { oldRequest.writeTo(output); StreamInput input = output.bytes().streamInput(); ADCancelTaskNodeRequest parsedRequest = new ADCancelTaskNodeRequest(input); - assertEquals(detectorId, parsedRequest.getDetectorId()); + assertEquals(detectorId, parsedRequest.getId()); assertEquals(userName, parsedRequest.getUserName()); assertNull(parsedRequest.getDetectorTaskId()); assertNull(parsedRequest.getReason()); diff --git a/src/test/java/org/opensearch/ad/transport/ADCancelTaskTests.java b/src/test/java/org/opensearch/ad/transport/ADCancelTaskTests.java index 51e72e11b..fa0f857d6 100644 --- a/src/test/java/org/opensearch/ad/transport/ADCancelTaskTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADCancelTaskTests.java @@ -11,14 +11,14 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.TestHelpers.randomDiscoveryNode; +import static org.opensearch.timeseries.TestHelpers.randomDiscoveryNode; import java.io.IOException; import java.util.List; import org.opensearch.action.ActionRequestValidationException; import org.opensearch.ad.ADUnitTestCase; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.task.ADTaskCancellationState; import org.opensearch.cluster.ClusterName; import org.opensearch.common.io.stream.BytesStreamOutput; @@ -41,14 +41,14 @@ public void testADCancelTaskRequest() throws IOException { request.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); ADCancelTaskRequest parsedRequest = new ADCancelTaskRequest(input); - assertEquals(request.getDetectorId(), parsedRequest.getDetectorId()); + assertEquals(request.getId(), parsedRequest.getId()); assertEquals(request.getUserName(), parsedRequest.getUserName()); } public void testInvalidADCancelTaskRequest() { ADCancelTaskRequest request = new ADCancelTaskRequest(null, null, null, randomDiscoveryNode()); ActionRequestValidationException validationException = request.validate(); - assertTrue(validationException.getMessage().contains(CommonErrorMessages.AD_ID_MISSING_MSG)); + assertTrue(validationException.getMessage().contains(ADCommonMessages.AD_ID_MISSING_MSG)); } public void testSerializeResponse() throws IOException { diff --git a/src/test/java/org/opensearch/ad/transport/ADResultBulkTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ADResultBulkTransportActionTests.java index 643093fd5..9887f1aff 100644 --- a/src/test/java/org/opensearch/ad/transport/ADResultBulkTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADResultBulkTransportActionTests.java @@ -31,8 +31,6 @@ import org.opensearch.action.bulk.BulkResponse; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.PlainActionFuture; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; @@ -41,9 +39,11 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.io.stream.StreamInput; import org.opensearch.index.IndexingPressure; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.TestHelpers; import 
org.opensearch.transport.TransportService; -public class ADResultBulkTransportActionTests extends AbstractADTest { +public class ADResultBulkTransportActionTests extends AbstractTimeSeriesTest { private ADResultBulkTransportAction resultBulk; private TransportService transportService; private ClusterService clusterService; @@ -68,11 +68,11 @@ public void setUp() throws Exception { Settings settings = Settings .builder() .put(IndexingPressure.MAX_INDEXING_BYTES.getKey(), "1KB") - .put(AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT.getKey(), 0.8) + .put(AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT.getKey(), 0.8) .build(); // without register these settings, the constructor of ADResultBulkTransportAction cannot invoke update consumer - setupTestNodes(AnomalyDetectorSettings.INDEX_PRESSURE_SOFT_LIMIT, AnomalyDetectorSettings.INDEX_PRESSURE_HARD_LIMIT); + setupTestNodes(AnomalyDetectorSettings.AD_INDEX_PRESSURE_SOFT_LIMIT, AnomalyDetectorSettings.AD_INDEX_PRESSURE_HARD_LIMIT); transportService = testNodes[0].transportService; clusterService = testNodes[0].clusterService; diff --git a/src/test/java/org/opensearch/ad/transport/ADStatsITTests.java b/src/test/java/org/opensearch/ad/transport/ADStatsITTests.java index 57da69965..da8f3dce7 100644 --- a/src/test/java/org/opensearch/ad/transport/ADStatsITTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADStatsITTests.java @@ -15,19 +15,19 @@ import java.util.Collections; import java.util.concurrent.ExecutionException; -import org.opensearch.ad.AnomalyDetectorPlugin; import org.opensearch.plugins.Plugin; import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; public class ADStatsITTests extends OpenSearchIntegTestCase { @Override protected Collection> nodePlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } protected Collection> transportClientPlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } public void testNormalADStats() throws ExecutionException, InterruptedException { diff --git a/src/test/java/org/opensearch/ad/transport/ADStatsNodesTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ADStatsNodesTransportActionTests.java index 95799f911..2284c311e 100644 --- a/src/test/java/org/opensearch/ad/transport/ADStatsNodesTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADStatsNodesTransportActionTests.java @@ -13,7 +13,7 @@ import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE; import java.time.Clock; import java.util.Arrays; @@ -37,9 +37,7 @@ import org.opensearch.ad.stats.suppliers.ModelsOnNodeSupplier; import org.opensearch.ad.stats.suppliers.SettableSupplier; import org.opensearch.ad.task.ADTaskManager; -import org.opensearch.ad.util.ClientUtil; import org.opensearch.ad.util.IndexUtils; -import org.opensearch.ad.util.Throttler; import org.opensearch.client.Client; import org.opensearch.cluster.metadata.IndexNameExpressionResolver; import org.opensearch.cluster.service.ClusterService; @@ -49,6 +47,7 @@ import org.opensearch.monitor.jvm.JvmStats; import org.opensearch.test.OpenSearchIntegTestCase; import 
org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.util.ClientUtil; import org.opensearch.transport.TransportService; public class ADStatsNodesTransportActionTests extends OpenSearchIntegTestCase { @@ -67,15 +66,9 @@ public void setUp() throws Exception { Client client = client(); Clock clock = mock(Clock.class); - Throttler throttler = new Throttler(clock); ThreadPool threadPool = mock(ThreadPool.class); IndexNameExpressionResolver indexNameResolver = mock(IndexNameExpressionResolver.class); - IndexUtils indexUtils = new IndexUtils( - client, - new ClientUtil(Settings.EMPTY, client, throttler, threadPool), - clusterService(), - indexNameResolver - ); + IndexUtils indexUtils = new IndexUtils(client, new ClientUtil(client), clusterService(), indexNameResolver); ModelManager modelManager = mock(ModelManager.class); CacheProvider cacheProvider = mock(CacheProvider.class); EntityCache cache = mock(EntityCache.class); @@ -86,11 +79,11 @@ public void setUp() throws Exception { nodeStatName1 = "nodeStat1"; nodeStatName2 = "nodeStat2"; - Settings settings = Settings.builder().put(MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); + Settings settings = Settings.builder().put(AD_MAX_MODEL_SIZE_PER_NODE.getKey(), 10).build(); ClusterService clusterService = mock(ClusterService.class); ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(MAX_MODEL_SIZE_PER_NODE))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AD_MAX_MODEL_SIZE_PER_NODE))) ); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); diff --git a/src/test/java/org/opensearch/ad/transport/ADStatsTests.java b/src/test/java/org/opensearch/ad/transport/ADStatsTests.java index 4d9f4f322..4836825f3 100644 --- a/src/test/java/org/opensearch/ad/transport/ADStatsTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADStatsTests.java @@ -34,11 +34,8 @@ import org.opensearch.Version; import org.opensearch.action.FailedNodeException; import org.opensearch.ad.common.exception.JsonPathNotFoundException; -import org.opensearch.ad.constant.CommonName; import org.opensearch.ad.ml.EntityModel; import org.opensearch.ad.ml.ModelState; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.stats.StatNames; import org.opensearch.cluster.ClusterName; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.io.stream.BytesStreamOutput; @@ -47,6 +44,9 @@ import org.opensearch.core.xcontent.ToXContent; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.stats.StatNames; import com.google.gson.JsonArray; import com.google.gson.JsonElement; diff --git a/src/test/java/org/opensearch/ad/transport/ADTaskProfileResponseTests.java b/src/test/java/org/opensearch/ad/transport/ADTaskProfileResponseTests.java index 8a28e8fb1..2654113d7 100644 --- a/src/test/java/org/opensearch/ad/transport/ADTaskProfileResponseTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADTaskProfileResponseTests.java @@ -11,7 +11,7 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.TestHelpers.randomDiscoveryNode; +import static org.opensearch.timeseries.TestHelpers.randomDiscoveryNode; import java.io.IOException; import java.util.List; diff --git a/src/test/java/org/opensearch/ad/transport/ADTaskProfileTests.java 
b/src/test/java/org/opensearch/ad/transport/ADTaskProfileTests.java index f299c9854..d8e13e9ec 100644 --- a/src/test/java/org/opensearch/ad/transport/ADTaskProfileTests.java +++ b/src/test/java/org/opensearch/ad/transport/ADTaskProfileTests.java @@ -11,7 +11,7 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.TestHelpers.randomDiscoveryNode; +import static org.opensearch.timeseries.TestHelpers.randomDiscoveryNode; import java.io.IOException; import java.time.Instant; @@ -21,9 +21,7 @@ import org.junit.Ignore; import org.opensearch.Version; import org.opensearch.action.ActionRequestValidationException; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonErrorMessages; +import org.opensearch.ad.constant.ADCommonMessages; import org.opensearch.ad.model.ADTaskProfile; import org.opensearch.cluster.ClusterName; import org.opensearch.cluster.node.DiscoveryNode; @@ -35,13 +33,15 @@ import org.opensearch.plugins.Plugin; import org.opensearch.test.InternalSettingsPlugin; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; import com.google.common.collect.ImmutableList; public class ADTaskProfileTests extends OpenSearchSingleNodeTestCase { @Override protected Collection> getPlugins() { - return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class); + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); } @Override @@ -56,14 +56,14 @@ public void testADTaskProfileRequest() throws IOException { request.writeTo(output); NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); ADTaskProfileRequest parsedRequest = new ADTaskProfileRequest(input); - assertEquals(request.getDetectorId(), parsedRequest.getDetectorId()); + assertEquals(request.getId(), parsedRequest.getId()); } public void testInvalidADTaskProfileRequest() { DiscoveryNode node = new DiscoveryNode(UUIDs.randomBase64UUID(), buildNewFakeTransportAddress(), Version.CURRENT); ADTaskProfileRequest request = new ADTaskProfileRequest(null, node); ActionRequestValidationException validationException = request.validate(); - assertTrue(validationException.getMessage().contains(CommonErrorMessages.AD_ID_MISSING_MSG)); + assertTrue(validationException.getMessage().contains(ADCommonMessages.AD_ID_MISSING_MSG)); } public void testADTaskProfileNodeResponse() throws IOException { @@ -165,12 +165,8 @@ public void testSerializeResponse() throws IOException { List adTaskProfileNodeResponses = response.readNodesFrom(input); assertEquals(1, adTaskProfileNodeResponses.size()); ADTaskProfileNodeResponse parsedProfile = adTaskProfileNodeResponses.get(0); - if (Version.CURRENT.onOrBefore(Version.V_1_0_0)) { - assertEquals(profile.getNodeId(), parsedProfile.getAdTaskProfile().getNodeId()); - assertNull(parsedProfile.getAdTaskProfile().getTaskId()); - } else { - assertEquals(profile.getTaskId(), parsedProfile.getAdTaskProfile().getTaskId()); - } + + assertEquals(profile.getTaskId(), parsedProfile.getAdTaskProfile().getTaskId()); } public void testADTaskProfileParseFullConstructor() throws IOException { diff --git a/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobActionTests.java b/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobActionTests.java index b0196a493..79bc66527 100644 --- 
a/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobActionTests.java @@ -25,8 +25,7 @@ import org.junit.Test; import org.opensearch.action.support.ActionFilters; import org.opensearch.ad.ExecuteADResultResponseRecorder; -import org.opensearch.ad.indices.AnomalyDetectionIndices; -import org.opensearch.ad.model.DetectionDateRange; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.task.ADTaskManager; import org.opensearch.client.Client; @@ -38,17 +37,18 @@ import org.opensearch.commons.ConfigConstants; import org.opensearch.core.action.ActionListener; import org.opensearch.core.common.io.stream.StreamInput; -import org.opensearch.core.rest.RestStatus; import org.opensearch.tasks.Task; import org.opensearch.test.OpenSearchIntegTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.transport.JobResponse; import org.opensearch.transport.TransportService; public class AnomalyDetectorJobActionTests extends OpenSearchIntegTestCase { private AnomalyDetectorJobTransportAction action; private Task task; private AnomalyDetectorJobRequest request; - private ActionListener response; + private ActionListener response; @Override @Before @@ -57,7 +57,7 @@ public void setUp() throws Exception { ClusterService clusterService = mock(ClusterService.class); ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES))) ); Settings build = Settings.builder().build(); @@ -75,16 +75,16 @@ public void setUp() throws Exception { client, clusterService, indexSettings(), - mock(AnomalyDetectionIndices.class), + mock(ADIndexManagement.class), xContentRegistry(), mock(ADTaskManager.class), mock(ExecuteADResultResponseRecorder.class) ); task = mock(Task.class); request = new AnomalyDetectorJobRequest("1234", 4567, 7890, "_start"); - response = new ActionListener() { + response = new ActionListener() { @Override - public void onResponse(AnomalyDetectorJobResponse adResponse) { + public void onResponse(JobResponse adResponse) { // Will not be called as there is no detector Assert.assertTrue(false); } @@ -116,7 +116,7 @@ public void testAdJobAction() { @Test public void testAdJobRequest() throws IOException { - DetectionDateRange detectionDateRange = new DetectionDateRange(Instant.MIN, Instant.now()); + DateRange detectionDateRange = new DateRange(Instant.MIN, Instant.now()); request = new AnomalyDetectorJobRequest("1234", detectionDateRange, false, 4567, 7890, "_start"); BytesStreamOutput out = new BytesStreamOutput(); @@ -138,10 +138,10 @@ public void testAdJobRequest_NullDetectionDateRange() throws IOException { @Test public void testAdJobResponse() throws IOException { BytesStreamOutput out = new BytesStreamOutput(); - AnomalyDetectorJobResponse response = new AnomalyDetectorJobResponse("1234", 45, 67, 890, RestStatus.OK); + JobResponse response = new JobResponse("1234"); response.writeTo(out); StreamInput input = out.bytes().streamInput(); - AnomalyDetectorJobResponse newResponse = new AnomalyDetectorJobResponse(input); + JobResponse newResponse = new JobResponse(input); Assert.assertEquals(response.getId(), newResponse.getId()); } } 
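The hunks above repeatedly stub the renamed ADIndexManagement (formerly AnomalyDetectionIndices) via initStateIndex and doesStateIndexExist. As a reading aid, the recurring pattern is sketched below; this is not part of the diff. The helper name stubStateIndexCreation is hypothetical, and the listener's generic parameter is an assumption inferred from the CreateIndexResponse handed to onResponse, since this patch text dropped angle brackets.

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.doAnswer;
import static org.mockito.Mockito.doReturn;

import org.opensearch.action.admin.indices.create.CreateIndexResponse;
import org.opensearch.ad.constant.ADCommonName;
import org.opensearch.ad.indices.ADIndexManagement;
import org.opensearch.core.action.ActionListener;

// Hypothetical helper illustrating the stubbing pattern used in these tests.
private void stubStateIndexCreation(ADIndexManagement detectionIndices, boolean acknowledged) {
    // Report that the detection state index does not exist yet.
    doReturn(false).when(detectionIndices).doesStateIndexExist();
    // When the test asks to create it, answer through the supplied listener.
    doAnswer(invocation -> {
        // Assumed ActionListener<CreateIndexResponse>; the patch lost its generics.
        ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
        listener.onResponse(new CreateIndexResponse(acknowledged, acknowledged, ADCommonName.DETECTION_STATE_INDEX));
        return null;
    }).when(detectionIndices).initStateIndex(any());
}
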
diff --git a/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportActionTests.java index ba87e5faa..6f7629039 100644 --- a/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/AnomalyDetectorJobTransportActionTests.java @@ -11,17 +11,17 @@ package org.opensearch.ad.transport; -import static org.opensearch.ad.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; -import static org.opensearch.ad.constant.CommonErrorMessages.DETECTOR_IS_RUNNING; -import static org.opensearch.ad.constant.CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG; +import static org.opensearch.ad.constant.ADCommonMessages.DETECTOR_IS_RUNNING; import static org.opensearch.ad.settings.AnomalyDetectorSettings.BATCH_TASK_PIECE_INTERVAL_SECONDS; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_BATCH_TASK_PER_NODE; import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_OLD_AD_TASK_DOCS_PER_DETECTOR; -import static org.opensearch.ad.util.RestHandlerUtils.PROFILE; -import static org.opensearch.ad.util.RestHandlerUtils.START_JOB; -import static org.opensearch.ad.util.RestHandlerUtils.STOP_JOB; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_PRIMARY_TERM; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO; +import static org.opensearch.timeseries.TestHelpers.HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS; +import static org.opensearch.timeseries.constant.CommonMessages.FAIL_TO_FIND_CONFIG_MSG; +import static org.opensearch.timeseries.util.RestHandlerUtils.PROFILE; +import static org.opensearch.timeseries.util.RestHandlerUtils.START_JOB; +import static org.opensearch.timeseries.util.RestHandlerUtils.STOP_JOB; import java.io.IOException; import java.time.Instant; @@ -40,24 +40,26 @@ import org.opensearch.OpenSearchStatusException; import org.opensearch.action.get.GetResponse; import org.opensearch.ad.HistoricalAnalysisIntegTestCase; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.mock.transport.MockAnomalyDetectorJobAction; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.ADTaskProfile; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.DetectionDateRange; -import org.opensearch.ad.stats.StatNames; import org.opensearch.client.Client; import org.opensearch.common.lucene.uid.Versions; import org.opensearch.common.settings.Settings; import org.opensearch.core.action.ActionListener; import org.opensearch.index.IndexNotFoundException; import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.transport.JobResponse; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -69,7 +71,7 @@ public class AnomalyDetectorJobTransportActionTests extends HistoricalAnalysisIn private 
Instant endTime; private String type = "error"; private int maxOldAdTaskDocsPerDetector = 2; - private DetectionDateRange dateRange; + private DateRange dateRange; @Override @Before @@ -77,7 +79,7 @@ public void setUp() throws Exception { super.setUp(); startTime = Instant.now().minus(10, ChronoUnit.DAYS); endTime = Instant.now(); - dateRange = new DetectionDateRange(startTime, endTime); + dateRange = new DateRange(startTime, endTime); ingestTestData(testIndex, startTime, detectionIntervalInMinutes, type, 2000); createDetectorIndex(); } @@ -111,7 +113,7 @@ public void testDetectorNotFound() { OpenSearchStatusException.class, () -> client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000) ); - assertTrue(exception.getMessage().contains(FAIL_TO_FIND_DETECTOR_MSG)); + assertTrue(exception.getMessage().contains(FAIL_TO_FIND_CONFIG_MSG)); } public void testValidHistoricalAnalysis() throws IOException, InterruptedException { @@ -135,7 +137,7 @@ public void testStartHistoricalAnalysisWithUser() throws IOException { ); Client nodeClient = getDataNodeClient(); if (nodeClient != null) { - AnomalyDetectorJobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100000); + JobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100000); ADTask adTask = getADTask(response.getId()); assertNotNull(adTask.getStartedBy()); assertNotNull(adTask.getUser()); @@ -165,7 +167,7 @@ public void testStartHistoricalAnalysisForSingleCategoryHCWithUser() throws IOEx Client nodeClient = getDataNodeClient(); if (nodeClient != null) { - AnomalyDetectorJobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100000); + JobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100000); waitUntil(() -> { try { ADTask task = getADTask(response.getId()); @@ -177,9 +179,9 @@ public void testStartHistoricalAnalysisForSingleCategoryHCWithUser() throws IOEx ADTask adTask = getADTask(response.getId()); assertEquals(ADTaskType.HISTORICAL_HC_DETECTOR.toString(), adTask.getTaskType()); assertTrue(HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS.contains(adTask.getState())); - assertEquals(categoryField, adTask.getDetector().getCategoryField().get(0)); + assertEquals(categoryField, adTask.getDetector().getCategoryFields().get(0)); - if (ADTaskState.FINISHED.name().equals(adTask.getState())) { + if (TaskState.FINISHED.name().equals(adTask.getState())) { List adTasks = searchADTasks(detectorId, true, 100); assertEquals(4, adTasks.size()); List entityTasks = adTasks @@ -217,7 +219,7 @@ public void testStartHistoricalAnalysisForMultiCategoryHCWithUser() throws IOExc Client nodeClient = getDataNodeClient(); if (nodeClient != null) { - AnomalyDetectorJobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100_000); + JobResponse response = nodeClient.execute(MockAnomalyDetectorJobAction.INSTANCE, request).actionGet(100_000); String taskId = response.getId(); waitUntil(() -> { @@ -232,10 +234,10 @@ public void testStartHistoricalAnalysisForMultiCategoryHCWithUser() throws IOExc assertEquals(ADTaskType.HISTORICAL_HC_DETECTOR.toString(), adTask.getTaskType()); // Task may fail if memory circuit breaker triggered assertTrue(HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS.contains(adTask.getState())); - assertEquals(categoryField, adTask.getDetector().getCategoryField().get(0)); - assertEquals(ipField, 
adTask.getDetector().getCategoryField().get(1)); + assertEquals(categoryField, adTask.getDetector().getCategoryFields().get(0)); + assertEquals(ipField, adTask.getDetector().getCategoryFields().get(1)); - if (ADTaskState.FINISHED.name().equals(adTask.getState())) { + if (TaskState.FINISHED.name().equals(adTask.getState())) { List adTasks = searchADTasks(detectorId, taskId, true, 100); assertEquals(5, adTasks.size()); List entityTasks = adTasks @@ -252,7 +254,7 @@ public void testRunMultipleTasksForHistoricalAnalysis() throws IOException, Inte .randomDetector(ImmutableList.of(maxValueFeature()), testIndex, detectionIntervalInMinutes, timeField); String detectorId = createDetector(detector); AnomalyDetectorJobRequest request = startDetectorJobRequest(detectorId, dateRange); - AnomalyDetectorJobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); + JobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); assertNotNull(response.getId()); OpenSearchStatusException exception = null; // Add retry to solve the flaky test @@ -296,8 +298,8 @@ public void testRaceConditionByStartingMultipleTasks() throws IOException, Inter List adTasks = searchADTasks(detectorId, null, 100); assertEquals(1, adTasks.size()); - assertTrue(adTasks.get(0).getLatest()); - assertNotEquals(ADTaskState.FAILED.name(), adTasks.get(0).getState()); + assertTrue(adTasks.get(0).isLatest()); + assertNotEquals(TaskState.FAILED.name(), adTasks.get(0).getState()); } // TODO: fix this flaky test case @@ -308,12 +310,12 @@ public void testCleanOldTaskDocs() throws InterruptedException, IOException { String detectorId = createDetector(detector); createDetectionStateIndex(); - List states = ImmutableList.of(ADTaskState.FAILED, ADTaskState.FINISHED, ADTaskState.STOPPED); - for (ADTaskState state : states) { + List states = ImmutableList.of(TaskState.FAILED, TaskState.FINISHED, TaskState.STOPPED); + for (TaskState state : states) { ADTask task = randomADTask(randomAlphaOfLength(5), detector, detectorId, dateRange, state); createADTask(task); } - long count = countDocs(CommonName.DETECTION_STATE_INDEX); + long count = countDocs(ADCommonName.DETECTION_STATE_INDEX); assertEquals(states.size(), count); AnomalyDetectorJobRequest request = new AnomalyDetectorJobRequest( @@ -325,7 +327,7 @@ public void testCleanOldTaskDocs() throws InterruptedException, IOException { START_JOB ); - AtomicReference response = new AtomicReference<>(); + AtomicReference response = new AtomicReference<>(); CountDownLatch latch = new CountDownLatch(1); Thread.sleep(2000); client().execute(AnomalyDetectorJobAction.INSTANCE, request, ActionListener.wrap(r -> { @@ -344,16 +346,16 @@ public void testCleanOldTaskDocs() throws InterruptedException, IOException { public void tearDown() throws Exception { super.tearDown(); // delete index will clear search context, this can avoid in-flight contexts error - deleteIndexIfExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX); - deleteIndexIfExists(CommonName.DETECTION_STATE_INDEX); + deleteIndexIfExists(CommonName.CONFIG_INDEX); + deleteIndexIfExists(ADCommonName.DETECTION_STATE_INDEX); } public void testStartRealtimeDetector() throws IOException { List realtimeResult = startRealtimeDetector(); String detectorId = realtimeResult.get(0); String jobId = realtimeResult.get(1); - GetResponse jobDoc = getDoc(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, detectorId); - AnomalyDetectorJob job = toADJob(jobDoc); + GetResponse jobDoc = 
getDoc(CommonName.JOB_INDEX, detectorId); + Job job = toADJob(jobDoc); assertTrue(job.isEnabled()); assertEquals(detectorId, job.getName()); @@ -368,7 +370,7 @@ private List startRealtimeDetector() throws IOException { .randomDetector(ImmutableList.of(maxValueFeature()), testIndex, detectionIntervalInMinutes, timeField); String detectorId = createDetector(detector); AnomalyDetectorJobRequest request = startDetectorJobRequest(detectorId, null); - AnomalyDetectorJobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); + JobResponse response = client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); String jobId = response.getId(); assertEquals(detectorId, jobId); return ImmutableList.of(detectorId, jobId); @@ -406,7 +408,7 @@ private void testInvalidDetector(AnomalyDetector detector, String error) throws assertEquals(error, exception.getMessage()); } - private AnomalyDetectorJobRequest startDetectorJobRequest(String detectorId, DetectionDateRange dateRange) { + private AnomalyDetectorJobRequest startDetectorJobRequest(String detectorId, DateRange dateRange) { return new AnomalyDetectorJobRequest(detectorId, dateRange, false, UNASSIGNED_SEQ_NO, UNASSIGNED_PRIMARY_TERM, START_JOB); } @@ -421,8 +423,8 @@ public void testStopRealtimeDetector() throws IOException { AnomalyDetectorJobRequest request = stopDetectorJobRequest(detectorId, false); client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000); - GetResponse doc = getDoc(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX, detectorId); - AnomalyDetectorJob job = toADJob(doc); + GetResponse doc = getDoc(CommonName.JOB_INDEX, detectorId); + Job job = toADJob(doc); assertFalse(job.isEnabled()); assertEquals(detectorId, job.getName()); @@ -430,13 +432,13 @@ public void testStopRealtimeDetector() throws IOException { assertEquals(1, adTasks.size()); assertEquals(ADTaskType.REALTIME_SINGLE_ENTITY.name(), adTasks.get(0).getTaskType()); assertNotEquals(jobId, adTasks.get(0).getTaskId()); - assertEquals(ADTaskState.STOPPED.name(), adTasks.get(0).getState()); + assertEquals(TaskState.STOPPED.name(), adTasks.get(0).getState()); } public void testStopHistoricalDetector() throws IOException, InterruptedException { updateTransientSettings(ImmutableMap.of(BATCH_TASK_PIECE_INTERVAL_SECONDS.getKey(), 5)); ADTask adTask = startHistoricalAnalysis(startTime, endTime); - assertEquals(ADTaskState.INIT.name(), adTask.getState()); + assertEquals(TaskState.INIT.name(), adTask.getState()); assertNull(adTask.getStartedBy()); assertNull(adTask.getUser()); waitUntil(() -> { @@ -446,7 +448,7 @@ public void testStopHistoricalDetector() throws IOException, InterruptedExceptio if (taskRunning) { // It's possible that the task not started on worker node yet. Recancel it to make sure // task cancelled. 
-                    AnomalyDetectorJobRequest request = stopDetectorJobRequest(adTask.getDetectorId(), true);
+                    AnomalyDetectorJobRequest request = stopDetectorJobRequest(adTask.getConfigId(), true);
                     client().execute(AnomalyDetectorJobAction.INSTANCE, request).actionGet(10000);
                 }
                 return !taskRunning;
@@ -455,13 +457,13 @@ public void testStopHistoricalDetector() throws IOException, InterruptedExceptio
             }
         }, 20, TimeUnit.SECONDS);
         ADTask stoppedTask = getADTask(adTask.getTaskId());
-        assertEquals(ADTaskState.STOPPED.name(), stoppedTask.getState());
+        assertEquals(TaskState.STOPPED.name(), stoppedTask.getState());
         assertEquals(0, getExecutingADTask());
     }

     public void testProfileHistoricalDetector() throws IOException, InterruptedException {
         ADTask adTask = startHistoricalAnalysis(startTime, endTime);
-        GetAnomalyDetectorRequest request = taskProfileRequest(adTask.getDetectorId());
+        GetAnomalyDetectorRequest request = taskProfileRequest(adTask.getConfigId());
         GetAnomalyDetectorResponse response = client().execute(GetAnomalyDetectorAction.INSTANCE, request).actionGet(10000);
         assertTrue(response.getDetectorProfile().getAdTaskProfile() != null);
@@ -478,7 +480,7 @@ public void testProfileHistoricalDetector() throws IOException, InterruptedExcep
         assertNull(response.getDetectorProfile().getAdTaskProfile().getNodeId());
         ADTask profileAdTask = response.getDetectorProfile().getAdTaskProfile().getAdTask();
         assertEquals(finishedTask.getTaskId(), profileAdTask.getTaskId());
-        assertEquals(finishedTask.getDetectorId(), profileAdTask.getDetectorId());
+        assertEquals(finishedTask.getConfigId(), profileAdTask.getConfigId());
         assertEquals(finishedTask.getDetector(), profileAdTask.getDetector());
         assertEquals(finishedTask.getState(), profileAdTask.getState());
     }
@@ -487,8 +489,8 @@ public void testProfileWithMultipleRunningTask() throws IOException {
         ADTask adTask1 = startHistoricalAnalysis(startTime, endTime);
         ADTask adTask2 = startHistoricalAnalysis(startTime, endTime);

-        GetAnomalyDetectorRequest request1 = taskProfileRequest(adTask1.getDetectorId());
-        GetAnomalyDetectorRequest request2 = taskProfileRequest(adTask2.getDetectorId());
+        GetAnomalyDetectorRequest request1 = taskProfileRequest(adTask1.getConfigId());
+        GetAnomalyDetectorRequest request2 = taskProfileRequest(adTask2.getConfigId());
         GetAnomalyDetectorResponse response1 = client().execute(GetAnomalyDetectorAction.INSTANCE, request1).actionGet(10000);
         GetAnomalyDetectorResponse response2 = client().execute(GetAnomalyDetectorAction.INSTANCE, request2).actionGet(10000);
         ADTaskProfile taskProfile1 = response1.getDetectorProfile().getAdTaskProfile();
diff --git a/src/test/java/org/opensearch/ad/transport/AnomalyResultTests.java b/src/test/java/org/opensearch/ad/transport/AnomalyResultTests.java
index 05916e0ca..1b23b6d51 100644
--- a/src/test/java/org/opensearch/ad/transport/AnomalyResultTests.java
+++ b/src/test/java/org/opensearch/ad/transport/AnomalyResultTests.java
@@ -20,6 +20,7 @@
 import static org.hamcrest.Matchers.nullValue;
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.anyDouble;
 import static org.mockito.Mockito.anyLong;
 import static org.mockito.Mockito.doAnswer;
@@ -33,8 +34,8 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.TestHelpers.createIndexBlockedState;
 import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder;
+import static org.opensearch.timeseries.TestHelpers.createIndexBlockedState;

 import java.io.IOException;
 import java.time.Instant;
@@ -64,34 +65,21 @@
 import org.opensearch.action.index.IndexResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
 import org.opensearch.ad.cluster.HashRing;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.common.exception.EndRunException;
-import org.opensearch.ad.common.exception.InternalFailure;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.common.exception.ResourceNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.feature.FeatureManager;
 import org.opensearch.ad.feature.SinglePointFeatures;
 import org.opensearch.ad.ml.ModelManager;
-import org.opensearch.ad.ml.SingleStreamModelIdMapper;
 import org.opensearch.ad.ml.ThresholdingResult;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.DetectorInternalState;
-import org.opensearch.ad.model.FeatureData;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.stats.ADStat;
 import org.opensearch.ad.stats.ADStats;
-import org.opensearch.ad.stats.StatNames;
 import org.opensearch.ad.stats.suppliers.CounterSupplier;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -115,6 +103,22 @@
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.index.IndexNotFoundException;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.common.exception.InternalFailure;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.common.exception.ResourceNotFoundException;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
+import org.opensearch.timeseries.model.FeatureData;
+import org.opensearch.timeseries.stats.StatNames;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.NodeNotConnectedException;
 import org.opensearch.transport.RemoteTransportException;
 import org.opensearch.transport.Transport;
@@ -129,7 +133,7 @@
 import test.org.opensearch.ad.util.JsonDeserializer;

-public class AnomalyResultTests extends AbstractADTest {
+public class AnomalyResultTests extends AbstractTimeSeriesTest {
     private Settings settings;
     private TransportService transportService;
     private ClusterService clusterService;
@@ -145,7 +149,7 @@ public class AnomalyResultTests extends AbstractADTest {
     private String adID;
     private String featureId;
     private String featureName;
-    private ADCircuitBreakerService adCircuitBreakerService;
+    private CircuitBreakerService adCircuitBreakerService;
     private ADStats adStats;
     private double confidence;
     private double anomalyGrade;
@@ -168,7 +172,7 @@ public void setUp() throws Exception {
         super.setUp();
         super.setUpLog4jForJUnit(AnomalyResultTransportAction.class);

-        setupTestNodes(AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY, AnomalyDetectorSettings.PAGE_SIZE);
+        setupTestNodes(AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY, AnomalyDetectorSettings.AD_PAGE_SIZE);

         transportService = testNodes[0].transportService;
         clusterService = testNodes[0].clusterService;
@@ -188,14 +192,14 @@ public void setUp() throws Exception {
         userIndex.add("test*");
         when(detector.getIndices()).thenReturn(userIndex);
         adID = "123";
-        when(detector.getDetectorId()).thenReturn(adID);
-        when(detector.getCategoryField()).thenReturn(null);
+        when(detector.getId()).thenReturn(adID);
+        when(detector.getCategoryFields()).thenReturn(null);
         doAnswer(invocation -> {
-            ActionListener> listener = invocation.getArgument(1);
+            ActionListener> listener = invocation.getArgument(2);
             listener.onResponse(Optional.of(detector));
             return null;
-        }).when(stateManager).getAnomalyDetector(any(String.class), any(ActionListener.class));
-        when(detector.getDetectorIntervalInMinutes()).thenReturn(1L);
+        }).when(stateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));
+        when(detector.getIntervalInMinutes()).thenReturn(1L);

         hashRing = mock(HashRing.class);
         Optional localNode = Optional.of(clusterService.state().nodes().getLocalNode());
@@ -249,7 +253,7 @@ public void setUp() throws Exception {
         thresholdModelID = SingleStreamModelIdMapper.getThresholdModelId(adID); // "123-threshold";
         // when(normalModelPartitioner.getThresholdModelId(any(String.class))).thenReturn(thresholdModelID);
-        adCircuitBreakerService = mock(ADCircuitBreakerService.class);
+        adCircuitBreakerService = mock(CircuitBreakerService.class);
         when(adCircuitBreakerService.isOpen()).thenReturn(false);

         ThreadPool threadPool = mock(ThreadPool.class);
@@ -274,7 +278,7 @@ public void setUp() throws Exception {
             }
             assertTrue(request != null && listener != null);
-            ShardId shardId = new ShardId(new Index(CommonName.ANOMALY_RESULT_INDEX_ALIAS, randomAlphaOfLength(10)), 0);
+            ShardId shardId = new ShardId(new Index(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, randomAlphaOfLength(10)), 0);
             listener.onResponse(new IndexResponse(shardId, request.id(), 1, 1, 1, true));
             return null;
@@ -300,12 +304,11 @@ public void setUp() throws Exception {
             GetRequest request = (GetRequest) args[0];
             ActionListener listener = (ActionListener) args[1];

-            if (request.index().equals(CommonName.DETECTION_STATE_INDEX)) {
+            if (request.index().equals(ADCommonName.DETECTION_STATE_INDEX)) {
                 DetectorInternalState.Builder result = new DetectorInternalState.Builder().lastUpdateTime(Instant.now());
-                listener
-                    .onResponse(TestHelpers.createGetResponse(result.build(), detector.getDetectorId(), CommonName.DETECTION_STATE_INDEX));
+                listener.onResponse(TestHelpers.createGetResponse(result.build(), detector.getId(), ADCommonName.DETECTION_STATE_INDEX));
             }
@@ -457,8 +460,8 @@ public void sendRequest(
         setupTestNodes(
             failureTransportInterceptor,
             Settings.EMPTY,
-            AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY,
-            AnomalyDetectorSettings.PAGE_SIZE
+            AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY,
+            AnomalyDetectorSettings.AD_PAGE_SIZE
         );

         // mock hashing ring response. This has to happen after setting up test nodes with the failure interceptor
@@ -518,7 +521,7 @@ public void testInsufficientCapacityExceptionDuringColdStart() {
             .getTRcfResult(any(String.class), any(String.class), any(double[].class), any(ActionListener.class));

         when(stateManager.fetchExceptionAndClear(any(String.class)))
-            .thenReturn(Optional.of(new LimitExceededException(adID, CommonErrorMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG)));
+            .thenReturn(Optional.of(new LimitExceededException(adID, CommonMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG)));

         // These constructors register handler in transport service
         new RCFResultTransportAction(
@@ -561,7 +564,7 @@ public void testInsufficientCapacityExceptionDuringColdStart() {
     public void testInsufficientCapacityExceptionDuringRestoringModel() {

         ModelManager rcfManager = mock(ModelManager.class);
-        doThrow(new NotSerializableExceptionWrapper(new LimitExceededException(adID, CommonErrorMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG)))
+        doThrow(new NotSerializableExceptionWrapper(new LimitExceededException(adID, CommonMessages.MEMORY_LIMIT_EXCEEDED_ERR_MSG)))
             .when(rcfManager)
             .getTRcfResult(any(String.class), any(String.class), any(double[].class), any(ActionListener.class));
@@ -680,8 +683,8 @@ public void sendRequest(
         setupTestNodes(
             failureTransportInterceptor,
             Settings.EMPTY,
-            AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY,
-            AnomalyDetectorSettings.PAGE_SIZE
+            AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY,
+            AnomalyDetectorSettings.AD_PAGE_SIZE
         );

         // mock hashing ring response. This has to happen after setting up test nodes with the failure interceptor
@@ -732,7 +735,7 @@ public void sendRequest(
     public void testCircuitBreaker() {

-        ADCircuitBreakerService breakerService = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService breakerService = mock(CircuitBreakerService.class);
         when(breakerService.isOpen()).thenReturn(true);

         // These constructors register handler in transport service
@@ -839,7 +842,7 @@ private void nodeNotConnectedExceptionTemplate(boolean isRCF, boolean temporary,
         PlainActionFuture listener = new PlainActionFuture<>();
         action.doExecute(null, request, listener);

-        assertException(listener, AnomalyDetectionException.class);
+        assertException(listener, TimeSeriesException.class);

         if (!temporary) {
             verify(hashRing, times(numberOfBuildCall)).buildCirclesForRealtimeAD();
@@ -865,10 +868,10 @@ public void testMute() {
         NodeStateManager muteStateManager = mock(NodeStateManager.class);
         when(muteStateManager.isMuted(any(String.class), any(String.class))).thenReturn(true);
         doAnswer(invocation -> {
-            ActionListener> listener = invocation.getArgument(1);
+            ActionListener> listener = invocation.getArgument(2);
             listener.onResponse(Optional.of(detector));
             return null;
-        }).when(muteStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class));
+        }).when(muteStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));
         AnomalyResultTransportAction action = new AnomalyResultTransportAction(
             new ActionFilters(Collections.emptySet()),
             transportService,
@@ -891,7 +894,7 @@ public void testMute() {
         PlainActionFuture listener = new PlainActionFuture<>();
         action.doExecute(null, request, listener);

-        Throwable exception = assertException(listener, AnomalyDetectionException.class);
+        Throwable exception = assertException(listener, TimeSeriesException.class);
         assertThat(exception.getMessage(), containsString(AnomalyResultTransportAction.NODE_UNRESPONSIVE_ERR_MSG));
     }
@@ -1062,29 +1065,29 @@ public void testJsonRequest() throws IOException, JsonPathNotFoundException {
         request.toXContent(builder, ToXContent.EMPTY_PARAMS);

         String json = builder.toString();
-        assertEquals(JsonDeserializer.getTextValue(json, CommonName.ID_JSON_KEY), request.getAdID());
+        assertEquals(JsonDeserializer.getTextValue(json, ADCommonName.ID_JSON_KEY), request.getAdID());
         assertEquals(JsonDeserializer.getLongValue(json, CommonName.START_JSON_KEY), request.getStart());
         assertEquals(JsonDeserializer.getLongValue(json, CommonName.END_JSON_KEY), request.getEnd());
     }

     public void testEmptyID() {
         ActionRequestValidationException e = new AnomalyResultRequest("", 100, 200).validate();
-        assertThat(e.validationErrors(), hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testZeroStartTime() {
         ActionRequestValidationException e = new AnomalyResultRequest(adID, 0, 200).validate();
-        assertThat(e.validationErrors(), hasItem(startsWith(CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG)));
+        assertThat(e.validationErrors(), hasItem(startsWith(CommonMessages.INVALID_TIMESTAMP_ERR_MSG)));
     }

     public void testNegativeEndTime() {
         ActionRequestValidationException e = new AnomalyResultRequest(adID, 0, -200).validate();
-        assertThat(e.validationErrors(), hasItem(startsWith(CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG)));
+        assertThat(e.validationErrors(), hasItem(startsWith(CommonMessages.INVALID_TIMESTAMP_ERR_MSG)));
     }

     public void testNegativeTime() {
         ActionRequestValidationException e = new AnomalyResultRequest(adID, 10, -200).validate();
-        assertThat(e.validationErrors(), hasItem(startsWith(CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG)));
+        assertThat(e.validationErrors(), hasItem(startsWith(CommonMessages.INVALID_TIMESTAMP_ERR_MSG)));
     }

     // no exception should be thrown
@@ -1354,7 +1357,7 @@ public void featureTestTemplate(FeatureTestMode mode) throws IOException {
                 .when(featureQuery)
                 .getCurrentFeatures(any(AnomalyDetector.class), anyLong(), anyLong(), any(ActionListener.class));
         } else if (mode == FeatureTestMode.AD_EXCEPTION) {
-            doThrow(AnomalyDetectionException.class)
+            doThrow(TimeSeriesException.class)
                 .when(featureQuery)
                 .getCurrentFeatures(any(AnomalyDetector.class), anyLong(), anyLong(), any(ActionListener.class));
         }
@@ -1471,7 +1474,7 @@ private void globalBlockTemplate(BlockType type, String errLogMsg, Settings inde
         PlainActionFuture listener = new PlainActionFuture<>();
         action.doExecute(null, request, listener);

-        assertException(listener, AnomalyDetectionException.class, errLogMsg);
+        assertException(listener, TimeSeriesException.class, errLogMsg);
     }

     private void globalBlockTemplate(BlockType type, String errLogMsg) {
@@ -1595,10 +1598,10 @@ public void testNullPointerRCFResult() {
     @SuppressWarnings("unchecked")
     public void testAllFeaturesDisabled() throws IOException {
         doAnswer(invocation -> {
-            ActionListener> listener = invocation.getArgument(1);
-            listener.onFailure(new EndRunException(adID, CommonErrorMessages.ALL_FEATURES_DISABLED_ERR_MSG, true));
+            ActionListener> listener = invocation.getArgument(2);
+            listener.onFailure(new EndRunException(adID, CommonMessages.ALL_FEATURES_DISABLED_ERR_MSG, true));
             return null;
-        }).when(stateManager).getAnomalyDetector(any(String.class), any(ActionListener.class));
+        }).when(stateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));

         AnomalyResultTransportAction action = new AnomalyResultTransportAction(
             new ActionFilters(Collections.emptySet()),
@@ -1623,7 +1626,7 @@ public void testAllFeaturesDisabled() throws IOException {
         PlainActionFuture listener = new PlainActionFuture<>();
         action.doExecute(null, request, listener);

-        assertException(listener, EndRunException.class, CommonErrorMessages.ALL_FEATURES_DISABLED_ERR_MSG);
+        assertException(listener, EndRunException.class, CommonMessages.ALL_FEATURES_DISABLED_ERR_MSG);
     }

     @SuppressWarnings("unchecked")
@@ -1635,7 +1638,7 @@ public void testEndRunDueToNoTrainingData() {
         doAnswer(invocation -> {
             Object[] args = invocation.getArguments();
             ActionListener listener = (ActionListener) args[3];
-            listener.onFailure(new IndexNotFoundException(CommonName.CHECKPOINT_INDEX_NAME));
+            listener.onFailure(new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME));
             return null;
         }).when(rcfManager).getTRcfResult(any(String.class), any(String.class), any(double[].class), any(ActionListener.class));
@@ -1709,7 +1712,7 @@ public void testColdStartEndRunException() {
                     .of(
                         new EndRunException(
                             adID,
-                            CommonErrorMessages.INVALID_SEARCH_QUERY_MSG,
+                            CommonMessages.INVALID_SEARCH_QUERY_MSG,
                             new NoSuchElementException("No value present"),
                             false
                         )
@@ -1738,7 +1741,7 @@ public void testColdStartEndRunException() {
         );
         action.doExecute(null, request, listener);
-        assertException(listener, EndRunException.class, CommonErrorMessages.INVALID_SEARCH_QUERY_MSG);
+        assertException(listener, EndRunException.class, CommonMessages.INVALID_SEARCH_QUERY_MSG);
         verify(featureQuery, times(1)).getColdStartData(any(AnomalyDetector.class), any(ActionListener.class));
     }
@@ -1753,7 +1756,7 @@ public void testColdStartEndRunExceptionNow() {
                     .of(
                         new EndRunException(
                             adID,
-                            CommonErrorMessages.INVALID_SEARCH_QUERY_MSG,
+                            CommonMessages.INVALID_SEARCH_QUERY_MSG,
                             new NoSuchElementException("No value present"),
                             true
                         )
@@ -1782,7 +1785,7 @@ public void testColdStartEndRunExceptionNow() {
         );
         action.doExecute(null, request, listener);
-        assertException(listener, EndRunException.class, CommonErrorMessages.INVALID_SEARCH_QUERY_MSG);
+        assertException(listener, EndRunException.class, CommonMessages.INVALID_SEARCH_QUERY_MSG);
         verify(featureQuery, never()).getColdStartData(any(AnomalyDetector.class), any(ActionListener.class));
     }
@@ -1791,7 +1794,7 @@ public void testColdStartBecauseFailtoGetCheckpoint() {
         ThreadPool mockThreadPool = mock(ThreadPool.class);
         setUpColdStart(
             mockThreadPool,
-            new ColdStartConfig.Builder().getCheckpointException(new IndexNotFoundException(CommonName.CHECKPOINT_INDEX_NAME)).build()
+            new ColdStartConfig.Builder().getCheckpointException(new IndexNotFoundException(ADCommonName.CHECKPOINT_INDEX_NAME)).build()
         );

         doAnswer(invocation -> {
diff --git a/src/test/java/org/opensearch/ad/transport/AnomalyResultTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/AnomalyResultTransportActionTests.java
index 74bacbfeb..78ffca8dd 100644
--- a/src/test/java/org/opensearch/ad/transport/AnomalyResultTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/AnomalyResultTransportActionTests.java
@@ -11,7 +11,7 @@
 package org.opensearch.ad.transport;

-import static org.opensearch.ad.TestHelpers.randomQuery;
+import static org.opensearch.timeseries.TestHelpers.randomQuery;

 import java.io.IOException;
 import java.time.Instant;
@@ -24,15 +24,15 @@
 import org.junit.Before;
 import org.opensearch.action.get.GetResponse;
 import org.opensearch.ad.ADIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Feature;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
-import org.opensearch.ad.util.ExceptionUtil;
 import org.opensearch.core.common.io.stream.NotSerializableExceptionWrapper;
 import org.opensearch.search.aggregations.AggregationBuilder;
 import org.opensearch.test.rest.OpenSearchRestTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
+import org.opensearch.timeseries.util.ExceptionUtil;

 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
@@ -219,7 +219,8 @@ private AnomalyDetector randomDetector(List indices, List featu
             Instant.now(),
             null,
             null,
-            null
+            null,
+            TestHelpers.randomImputationOption()
         );
     }
@@ -241,7 +242,8 @@ private AnomalyDetector randomHCDetector(List indices, List fea
             Instant.now(),
             ImmutableList.of(categoryField),
             null,
-            null
+            null,
+            TestHelpers.randomImputationOption()
         );
     }
@@ -268,7 +270,7 @@ private void assertErrorMessage(String adId, String errorMessage, boolean hcDete
             }
         } else {
             e = expectThrowsAnyOf(
-                ImmutableList.of(NotSerializableExceptionWrapper.class, AnomalyDetectionException.class),
+                ImmutableList.of(NotSerializableExceptionWrapper.class, TimeSeriesException.class),
                 () -> client().execute(AnomalyResultAction.INSTANCE, resultRequest).actionGet(30_000)
             );
         }
diff --git a/src/test/java/org/opensearch/ad/transport/CronTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/CronTransportActionTests.java
index 040a7bbb4..7c3de7ed2 100644
--- a/src/test/java/org/opensearch/ad/transport/CronTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/CronTransportActionTests.java
@@ -21,11 +21,8 @@
 import org.junit.Assert;
 import org.junit.Before;
-import org.junit.BeforeClass;
 import org.opensearch.Version;
 import org.opensearch.action.support.ActionFilters;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.NodeStateManager;
 import org.opensearch.ad.caching.CacheProvider;
 import org.opensearch.ad.caching.EntityCache;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
@@ -33,7 +30,6 @@
 import org.opensearch.ad.ml.EntityColdStarter;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.Bwc;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.service.ClusterService;
@@ -43,21 +39,18 @@
 import org.opensearch.core.xcontent.ToXContent;
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.NodeStateManager;
 import org.opensearch.transport.TransportService;

 import com.google.gson.JsonElement;

 import test.org.opensearch.ad.util.JsonDeserializer;

-public class CronTransportActionTests extends AbstractADTest {
+public class CronTransportActionTests extends AbstractTimeSeriesTest {
     private CronTransportAction action;
     private String localNodeID;

-    @BeforeClass
-    public static void setUpBeforeClass() {
-        Bwc.DISABLE_BWC = false;
-    }
-
     @Override
     @Before
     public void setUp() throws Exception {
@@ -99,10 +92,10 @@ public void testNormal() throws IOException, JsonPathNotFoundException {
         CronNodeRequest nodeRequest = new CronNodeRequest();
         BytesStreamOutput nodeRequestOut = new BytesStreamOutput();
-        nodeRequestOut.setVersion(Version.V_1_0_0);
+        nodeRequestOut.setVersion(Version.V_2_0_0);
         nodeRequest.writeTo(nodeRequestOut);
         StreamInput siNode = nodeRequestOut.bytes().streamInput();
-        siNode.setVersion(Version.V_1_0_0);
+        siNode.setVersion(Version.V_2_0_0);
         CronNodeRequest nodeResponseRead = new CronNodeRequest(siNode);
diff --git a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorActionTests.java b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorActionTests.java
index c26da5a21..93e291325 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorActionTests.java
@@ -49,7 +49,7 @@ public void setUp() throws Exception {
         ClusterService clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
         adTaskManager = mock(ADTaskManager.class);
diff --git a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTests.java b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTests.java
index 6b9b08dfa..03608048f 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTests.java
@@ -13,7 +13,6 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX;
 import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO;

 import java.time.Instant;
@@ -35,13 +34,8 @@
 import org.opensearch.action.get.GetResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
-import org.opensearch.ad.model.IntervalTimeConfiguration;
-import org.opensearch.ad.rest.handler.AnomalyDetectorFunction;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.client.Client;
@@ -59,10 +53,16 @@
 import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule;
 import org.opensearch.tasks.Task;
 import org.opensearch.telemetry.tracing.noop.NoopTracer;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.function.ExecutorFunction;
+import org.opensearch.timeseries.model.IntervalTimeConfiguration;
+import org.opensearch.timeseries.model.Job;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportService;

-public class DeleteAnomalyDetectorTests extends AbstractADTest {
+public class DeleteAnomalyDetectorTests extends AbstractTimeSeriesTest {
     private DeleteAnomalyDetectorTransportAction action;
     private TransportService transportService;
     private ActionFilters actionFilters;
@@ -72,7 +72,7 @@ public class DeleteAnomalyDetectorTests extends AbstractADTest {
     private DeleteResponse deleteResponse;
     private GetResponse getResponse;
     ClusterService clusterService;
-    private AnomalyDetectorJob jobParameter;
+    private Job jobParameter;

     @BeforeClass
     public static void setUpBeforeClass() {
@@ -90,7 +90,7 @@ public void setUp() throws Exception {
         clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
         transportService = new TransportService(
@@ -119,7 +119,7 @@ public void setUp() throws Exception {
             adTaskManager
         );

-        jobParameter = mock(AnomalyDetectorJob.class);
+        jobParameter = mock(Job.class);
         when(jobParameter.getName()).thenReturn(randomAlphaOfLength(10));
         IntervalSchedule schedule = new IntervalSchedule(Instant.now(), 1, ChronoUnit.MINUTES);
         when(jobParameter.getSchedule()).thenReturn(schedule);
@@ -202,7 +202,7 @@ private ClusterState createClusterState() {
         Map immutableOpenMap = new HashMap<>();
         immutableOpenMap
             .put(
-                ANOMALY_DETECTOR_JOB_INDEX,
+                CommonName.JOB_INDEX,
                 IndexMetadata
                     .builder("test")
                     .settings(
@@ -250,7 +250,7 @@ private void setupMocks(
         doAnswer(invocation -> {
             Object[] args = invocation.getArguments();

-            AnomalyDetectorFunction function = (AnomalyDetectorFunction) args[1];
+            ExecutorFunction function = (ExecutorFunction) args[1];
             function.execute();
             return null;
@@ -282,7 +282,7 @@ private void setupMocks(
         }
         getResponse = new GetResponse(
             new GetResult(
-                AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX,
+                CommonName.JOB_INDEX,
                 "id",
                 UNASSIGNED_SEQ_NO,
                 0,
@@ -290,7 +290,7 @@ private void setupMocks(
                 true,
                 BytesReference
                     .bytes(
-                        new AnomalyDetectorJob(
+                        new Job(
                             "1234",
                             jobParameter.getSchedule(),
                             jobParameter.getWindowDelay(),
@@ -300,7 +300,7 @@ private void setupMocks(
                             Instant.now(),
                             60L,
                             TestHelpers.randomUser(),
-                            jobParameter.getResultIndex()
+                            jobParameter.getCustomResultIndex()
                         ).toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)
                     ),
                 Collections.emptyMap(),
diff --git a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportActionTests.java
index 5712d8e48..ac81ecf25 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyDetectorTransportActionTests.java
@@ -18,10 +18,10 @@
 import org.junit.Before;
 import org.opensearch.action.delete.DeleteResponse;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Feature;
 import org.opensearch.test.OpenSearchIntegTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.model.Feature;

 import com.google.common.collect.ImmutableList;

diff --git a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyResultsTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyResultsTransportActionTests.java
index 76e5eb0dc..5653a577c 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteAnomalyResultsTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteAnomalyResultsTransportActionTests.java
@@ -11,8 +11,8 @@
 package org.opensearch.ad.transport;

-import static org.opensearch.ad.TestHelpers.matchAllRequest;
-import static org.opensearch.ad.constant.CommonName.ANOMALY_RESULT_INDEX_ALIAS;
+import static org.opensearch.ad.constant.ADCommonName.ANOMALY_RESULT_INDEX_ALIAS;
+import static org.opensearch.timeseries.TestHelpers.matchAllRequest;

 import java.io.IOException;
 import java.util.concurrent.TimeUnit;
@@ -20,11 +20,11 @@
 import org.junit.Ignore;
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.index.query.BoolQueryBuilder;
 import org.opensearch.index.query.MatchAllQueryBuilder;
 import org.opensearch.index.reindex.BulkByScrollResponse;
 import org.opensearch.index.reindex.DeleteByQueryRequest;
+import org.opensearch.timeseries.TestHelpers;

 public class DeleteAnomalyResultsTransportActionTests extends HistoricalAnalysisIntegTestCase {

diff --git a/src/test/java/org/opensearch/ad/transport/DeleteITTests.java b/src/test/java/org/opensearch/ad/transport/DeleteITTests.java
index 222f5434d..1a57504cc 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteITTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteITTests.java
@@ -17,19 +17,19 @@
 import org.opensearch.action.ActionRequestValidationException;
 import org.opensearch.ad.ADIntegTestCase;
-import org.opensearch.ad.AnomalyDetectorPlugin;
 import org.opensearch.common.action.ActionFuture;
 import org.opensearch.plugins.Plugin;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

 public class DeleteITTests extends ADIntegTestCase {

     @Override
     protected Collection> nodePlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     protected Collection> transportClientPlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     public void testNormalStopDetector() throws ExecutionException, InterruptedException {
diff --git a/src/test/java/org/opensearch/ad/transport/DeleteModelTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/DeleteModelTransportActionTests.java
index 3866ab962..b76925492 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteModelTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteModelTransportActionTests.java
@@ -27,12 +27,10 @@
 import org.opensearch.Version;
 import org.opensearch.action.ActionRequestValidationException;
 import org.opensearch.action.support.ActionFilters;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.NodeStateManager;
 import org.opensearch.ad.caching.CacheProvider;
 import org.opensearch.ad.caching.EntityCache;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.feature.FeatureManager;
 import org.opensearch.ad.ml.EntityColdStarter;
 import org.opensearch.ad.ml.ModelManager;
@@ -46,13 +44,15 @@
 import org.opensearch.core.xcontent.ToXContent;
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.NodeStateManager;
 import org.opensearch.transport.TransportService;

 import com.google.gson.JsonElement;

 import test.org.opensearch.ad.util.JsonDeserializer;

-public class DeleteModelTransportActionTests extends AbstractADTest {
+public class DeleteModelTransportActionTests extends AbstractTimeSeriesTest {
     private DeleteModelTransportAction action;
     private String localNodeID;

@@ -134,6 +134,6 @@ public void testNormal() throws IOException, JsonPathNotFoundException {

     public void testEmptyDetectorID() {
         ActionRequestValidationException e = new DeleteModelRequest().validate();
-        assertThat(e.validationErrors(), Matchers.hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), Matchers.hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }
 }
diff --git a/src/test/java/org/opensearch/ad/transport/DeleteTests.java b/src/test/java/org/opensearch/ad/transport/DeleteTests.java
index ffc02aac4..4821cbfbd 100644
--- a/src/test/java/org/opensearch/ad/transport/DeleteTests.java
+++ b/src/test/java/org/opensearch/ad/transport/DeleteTests.java
@@ -39,11 +39,9 @@
 import org.opensearch.action.FailedNodeException;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.AbstractADTest;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.node.DiscoveryNode;
@@ -59,12 +57,14 @@
 import org.opensearch.index.reindex.BulkByScrollResponse;
 import org.opensearch.tasks.Task;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
 import org.opensearch.transport.TransportService;

 import test.org.opensearch.ad.util.ClusterCreation;
 import test.org.opensearch.ad.util.JsonDeserializer;

-public class DeleteTests extends AbstractADTest {
+public class DeleteTests extends AbstractTimeSeriesTest {
     private DeleteModelResponse response;
     private List failures;
     private List deleteModelResponse;
@@ -148,12 +148,12 @@ public void testSerialzationResponse() throws IOException {

     public void testEmptyIDDeleteModel() {
         ActionRequestValidationException e = new DeleteModelRequest("").validate();
-        assertThat(e.validationErrors(), Matchers.hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), Matchers.hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testEmptyIDStopDetector() {
         ActionRequestValidationException e = new StopDetectorRequest().validate();
-        assertThat(e.validationErrors(), hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testValidIDStopDetector() {
@@ -185,7 +185,7 @@ public void testJsonRequestTemplate(R request, Supplier
             EntityProfileName.getName("abc"));
-        assertEquals(exception.getMessage(), CommonErrorMessages.UNSUPPORTED_PROFILE_TYPE);
+        assertEquals(exception.getMessage(), ADCommonMessages.UNSUPPORTED_PROFILE_TYPE);
     }
 }
diff --git a/src/test/java/org/opensearch/ad/transport/EntityResultTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/EntityResultTransportActionTests.java
index 4257ba49e..f7eb2c8e9 100644
--- a/src/test/java/org/opensearch/ad/transport/EntityResultTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/EntityResultTransportActionTests.java
@@ -17,6 +17,7 @@
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.ArgumentMatchers.anyInt;
 import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.ArgumentMatchers.eq;
 import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.doThrow;
 import static org.mockito.Mockito.eq;
@@ -48,27 +49,20 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
 import org.opensearch.action.support.master.AcknowledgedResponse;
-import org.opensearch.ad.AbstractADTest;
 import org.opensearch.ad.AnomalyDetectorJobRunnerTests;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
 import org.opensearch.ad.caching.CacheProvider;
 import org.opensearch.ad.caching.EntityCache;
-import org.opensearch.ad.common.exception.EndRunException;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.constant.CommonValue;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.ml.CheckpointDao;
 import org.opensearch.ad.ml.EntityColdStarter;
 import org.opensearch.ad.ml.EntityModel;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.ml.ModelState;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Entity;
 import org.opensearch.ad.ratelimit.CheckpointReadWorker;
 import org.opensearch.ad.ratelimit.ColdEntityWorker;
 import org.opensearch.ad.ratelimit.EntityColdStartWorker;
@@ -76,7 +70,6 @@
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.stats.ADStat;
 import org.opensearch.ad.stats.ADStats;
-import org.opensearch.ad.stats.StatNames;
 import org.opensearch.ad.stats.suppliers.CounterSupplier;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
@@ -85,6 +78,17 @@
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.xcontent.ToXContent;
 import org.opensearch.core.xcontent.XContentBuilder;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.AnalysisType;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.stats.StatNames;
 import org.opensearch.transport.TransportService;

 import com.google.gson.JsonArray;
@@ -94,12 +98,12 @@
 import test.org.opensearch.ad.util.MLUtil;
 import test.org.opensearch.ad.util.RandomModelStateConfig;

-public class EntityResultTransportActionTests extends AbstractADTest {
+public class EntityResultTransportActionTests extends AbstractTimeSeriesTest {
     EntityResultTransportAction entityResult;
     ActionFilters actionFilters;
     TransportService transportService;
     ModelManager manager;
-    ADCircuitBreakerService adCircuitBreakerService;
+    CircuitBreakerService adCircuitBreakerService;
     CheckpointDao checkpointDao;
     CacheProvider provider;
     EntityCache entityCache;
@@ -128,7 +132,7 @@ public class EntityResultTransportActionTests extends AbstractADTest {
     EntityColdStarter coldStarter;
     ColdEntityWorker coldEntityQueue;
    EntityColdStartWorker entityColdStartQueue;
-    AnomalyDetectionIndices indexUtil;
+    ADIndexManagement indexUtil;
     ClusterService clusterService;
     ADStats adStats;
@@ -150,7 +154,7 @@ public void setUp() throws Exception {
         actionFilters = mock(ActionFilters.class);
         transportService = mock(TransportService.class);

-        adCircuitBreakerService = mock(ADCircuitBreakerService.class);
+        adCircuitBreakerService = mock(CircuitBreakerService.class);
         when(adCircuitBreakerService.isOpen()).thenReturn(false);

         checkpointDao = mock(CheckpointDao.class);
@@ -168,14 +172,14 @@ public void setUp() throws Exception {
         settings = Settings
             .builder()
-            .put(AnomalyDetectorSettings.COOLDOWN_MINUTES.getKey(), TimeValue.timeValueMinutes(5))
-            .put(AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ.getKey(), TimeValue.timeValueHours(12))
+            .put(AnomalyDetectorSettings.AD_COOLDOWN_MINUTES.getKey(), TimeValue.timeValueMinutes(5))
+            .put(AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ.getKey(), TimeValue.timeValueHours(12))
             .build();

         clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
         manager = new ModelManager(
@@ -188,7 +192,7 @@ public void setUp() throws Exception {
             0,
             0,
             null,
-            AnomalyDetectorSettings.CHECKPOINT_SAVING_FREQ,
+            AnomalyDetectorSettings.AD_CHECKPOINT_SAVING_FREQ,
             mock(EntityColdStarter.class),
             null,
             null,
@@ -204,22 +208,22 @@ public void setUp() throws Exception {
         detector = TestHelpers.randomAnomalyDetectorUsingCategoryFields(detectorId, Arrays.asList(field));
         stateManager = mock(NodeStateManager.class);
         doAnswer(invocation -> {
-            ActionListener> listener = invocation.getArgument(1);
+            ActionListener> listener = invocation.getArgument(2);
             listener.onResponse(Optional.of(detector));
             return null;
-        }).when(stateManager).getAnomalyDetector(any(String.class), any(ActionListener.class));
+        }).when(stateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));

         cacheMissEntity = "0.0.0.1";
         cacheMissData = new double[] { 0.1 };
         cacheHitEntity = "0.0.0.2";
         cacheHitData = new double[] { 0.2 };
-        cacheMissEntityObj = Entity.createSingleAttributeEntity(detector.getCategoryField().get(0), cacheMissEntity);
+        cacheMissEntityObj = Entity.createSingleAttributeEntity(detector.getCategoryFields().get(0), cacheMissEntity);
         entities.put(cacheMissEntityObj, cacheMissData);
-        cacheHitEntityObj = Entity.createSingleAttributeEntity(detector.getCategoryField().get(0), cacheHitEntity);
+        cacheHitEntityObj = Entity.createSingleAttributeEntity(detector.getCategoryFields().get(0), cacheHitEntity);
         entities.put(cacheHitEntityObj, cacheHitData);
-        tooLongEntity = randomAlphaOfLength(AnomalyDetectorSettings.MAX_ENTITY_LENGTH + 1);
+        tooLongEntity = randomAlphaOfLength(257);
         tooLongData = new double[] { 0.3 };
-        entities.put(Entity.createSingleAttributeEntity(detector.getCategoryField().get(0), tooLongEntity), tooLongData);
+        entities.put(Entity.createSingleAttributeEntity(detector.getCategoryFields().get(0), tooLongEntity), tooLongData);

         ModelState state = MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).build());
         when(entityCache.get(eq(cacheMissEntityObj.getModelId(detectorId).get()), any())).thenReturn(null);
@@ -229,7 +233,7 @@ public void setUp() throws Exception {
         coldEntities.add(cacheMissEntityObj);
         when(entityCache.selectUpdateCandidate(any(), anyString(), any())).thenReturn(Pair.of(new ArrayList<>(), coldEntities));

-        indexUtil = mock(AnomalyDetectionIndices.class);
+        indexUtil = mock(ADIndexManagement.class);
         when(indexUtil.getSchemaVersion(any())).thenReturn(CommonValue.NO_SCHEMA_VERSION);

         resultWriteQueue = mock(ResultWriteWorker.class);
@@ -299,10 +303,10 @@ public void testNormal() {
     @SuppressWarnings("unchecked")
     public void testFailtoGetDetector() {
         doAnswer(invocation -> {
-            ActionListener> listener = invocation.getArgument(1);
+            ActionListener> listener = invocation.getArgument(2);
             listener.onResponse(Optional.empty());
             return null;
-        }).when(stateManager).getAnomalyDetector(any(String.class), any(ActionListener.class));
+        }).when(stateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class));

         PlainActionFuture future = PlainActionFuture.newFuture();
@@ -333,19 +337,19 @@ public void testValidRequest() {
     public void testEmptyId() {
         request = new EntityResultRequest("", entities, start, end);
         ActionRequestValidationException e = request.validate();
-        assertThat(e.validationErrors(), hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testReverseTime() {
         request = new EntityResultRequest(detectorId, entities, end, start);
         ActionRequestValidationException e = request.validate();
-        assertThat(e.validationErrors(), hasItem(startsWith(CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG)));
+        assertThat(e.validationErrors(), hasItem(startsWith(CommonMessages.INVALID_TIMESTAMP_ERR_MSG)));
     }

     public void testNegativeTime() {
         request = new EntityResultRequest(detectorId, entities, start, -end);
         ActionRequestValidationException e = request.validate();
-        assertThat(e.validationErrors(), hasItem(startsWith(CommonErrorMessages.INVALID_TIMESTAMP_ERR_MSG)));
+        assertThat(e.validationErrors(), hasItem(startsWith(CommonMessages.INVALID_TIMESTAMP_ERR_MSG)));
     }

     public void testJsonResponse() throws IOException, JsonPathNotFoundException {
@@ -353,7 +357,7 @@ public void testJsonResponse() throws IOException, JsonPathNotFoundException {
         request.toXContent(builder, ToXContent.EMPTY_PARAMS);

         String json = builder.toString();
-        assertEquals(JsonDeserializer.getTextValue(json, CommonName.ID_JSON_KEY), detectorId);
+        assertEquals(JsonDeserializer.getTextValue(json, ADCommonName.ID_JSON_KEY), detectorId);
         assertEquals(JsonDeserializer.getLongValue(json, CommonName.START_JSON_KEY), start);
         assertEquals(JsonDeserializer.getLongValue(json, CommonName.END_JSON_KEY), end);
         JsonArray array = JsonDeserializer.getArrayValue(json, CommonName.ENTITIES_JSON_KEY);
diff --git a/src/test/java/org/opensearch/ad/transport/ForwardADTaskRequestTests.java b/src/test/java/org/opensearch/ad/transport/ForwardADTaskRequestTests.java
index 20cb97805..633a9a4fe 100644
--- a/src/test/java/org/opensearch/ad/transport/ForwardADTaskRequestTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ForwardADTaskRequestTests.java
@@ -11,12 +11,12 @@
 package org.opensearch.ad.transport;

-import static org.opensearch.ad.TestHelpers.randomIntervalTimeConfiguration;
-import static org.opensearch.ad.TestHelpers.randomQuery;
-import static org.opensearch.ad.TestHelpers.randomUser;
 import static org.opensearch.ad.model.ADTaskAction.CLEAN_CACHE;
 import static org.opensearch.ad.model.ADTaskAction.CLEAN_STALE_RUNNING_ENTITIES;
 import static org.opensearch.ad.model.ADTaskAction.START;
+import static org.opensearch.timeseries.TestHelpers.randomIntervalTimeConfiguration;
+import static org.opensearch.timeseries.TestHelpers.randomQuery;
+import static org.opensearch.timeseries.TestHelpers.randomUser;

 import java.io.IOException;
 import java.time.Instant;
@@ -25,20 +25,20 @@
 import org.opensearch.Version;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.ADVersionException;
 import org.opensearch.ad.mock.transport.MockADTaskAction_1_0;
 import org.opensearch.ad.mock.transport.MockForwardADTaskRequest_1_0;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
 import org.opensearch.core.common.io.stream.NamedWriteableRegistry;
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.InternalSettingsPlugin;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.common.exception.VersionException;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;

 import com.google.common.collect.ImmutableList;

@@ -46,7 +46,7 @@ public class ForwardADTaskRequestTests extends OpenSearchSingleNodeTestCase {
     @Override
     protected Collection> getPlugins() {
-        return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class);
+        return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class);
     }

     @Override
@@ -54,9 +54,9 @@ protected NamedWriteableRegistry writableRegistry() {
         return getInstanceFromNode(NamedWriteableRegistry.class);
     }

-    public void testUnsupportedVersion() throws IOException {
+    public void testNullVersion() throws IOException {
         AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableList.of());
-        expectThrows(ADVersionException.class, () -> new ForwardADTaskRequest(detector, null, null, null, null, Version.V_1_0_0));
+        expectThrows(VersionException.class, () -> new ForwardADTaskRequest(detector, null, null, null, null, null));
     }

     public void testNullDetectorIdAndTaskAction() throws IOException {
@@ -71,15 +71,16 @@ public void testNullDetectorIdAndTaskAction() throws IOException {
             randomQuery(),
             randomIntervalTimeConfiguration(),
             randomIntervalTimeConfiguration(),
-            randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE),
+            randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE),
             null,
             randomInt(),
             Instant.now(),
             null,
             randomUser(),
-            null
+            null,
+            TestHelpers.randomImputationOption()
         );
-        ForwardADTaskRequest request = new ForwardADTaskRequest(detector, null, null, null, null, Version.V_1_1_0);
+        ForwardADTaskRequest request = new ForwardADTaskRequest(detector, null, null, null, null, Version.V_2_1_0);
         ActionRequestValidationException validate = request.validate();
         assertEquals("Validation Failed: 1: AD ID is missing;2: AD task action is missing;", validate.getMessage());
     }
@@ -114,7 +115,7 @@ public void testParseRequestFromOldNodeWithNewCode() throws IOException {
         // Parse old forward AD task request of 1.0, will reject it directly,
         // so if old node is coordinating node, it can't use new node as worker node to run task.
         NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry());
-        expectThrows(ADVersionException.class, () -> new ForwardADTaskRequest(input));
+        expectThrows(VersionException.class, () -> new ForwardADTaskRequest(input));
     }

     public void testParseRequestFromNewNodeWithOldCode_StartAction() throws IOException {
diff --git a/src/test/java/org/opensearch/ad/transport/ForwardADTaskTests.java b/src/test/java/org/opensearch/ad/transport/ForwardADTaskTests.java
index ac278aebb..982cf9262 100644
--- a/src/test/java/org/opensearch/ad/transport/ForwardADTaskTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ForwardADTaskTests.java
@@ -18,9 +18,7 @@
 import org.junit.Before;
 import org.opensearch.Version;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonErrorMessages;
+import org.opensearch.ad.constant.ADCommonMessages;
 import org.opensearch.ad.model.ADTaskAction;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
@@ -28,6 +26,8 @@
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.InternalSettingsPlugin;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

 import com.google.common.collect.ImmutableMap;

@@ -43,7 +43,7 @@ public void setUp() throws Exception {
     @Override
     protected Collection> getPlugins() {
-        return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class);
+        return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class);
     }

     @Override
@@ -86,7 +86,7 @@ public void testInvalidForwardADTaskRequest() {
         );

         ActionRequestValidationException exception = request.validate();
-        assertTrue(exception.getMessage().contains(CommonErrorMessages.DETECTOR_MISSING));
+        assertTrue(exception.getMessage().contains(ADCommonMessages.DETECTOR_MISSING));
     }

     private void testForwardADTaskRequest(ForwardADTaskRequest request) throws IOException {
diff --git a/src/test/java/org/opensearch/ad/transport/ForwardADTaskTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ForwardADTaskTransportActionTests.java
index d589313f2..ac83b5c8e 100644
--- a/src/test/java/org/opensearch/ad/transport/ForwardADTaskTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ForwardADTaskTransportActionTests.java
@@ -31,8 +31,6 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.ad.ADUnitTestCase;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.feature.FeatureManager;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.ADTaskType;
@@ -40,6 +38,9 @@
 import org.opensearch.ad.task.ADTaskManager;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.tasks.Task;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.transport.JobResponse;
 import org.opensearch.transport.TransportService;

 import com.google.common.collect.ImmutableList;
@@ -53,7 +54,7 @@ public class ForwardADTaskTransportActionTests extends ADUnitTestCase {
     private NodeStateManager stateManager;
     private ForwardADTaskTransportAction forwardADTaskTransportAction;
     private Task task;
-    private ActionListener listener;
+    private ActionListener listener;

     @SuppressWarnings("unchecked")
     @Override
diff --git a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorActionTests.java b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorActionTests.java
index 3f51105dc..2a0b677ed 100644
--- a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorActionTests.java
@@ -13,18 +13,12 @@
 import java.io.IOException;
 import java.util.Collection;
-import java.util.List;

 import org.junit.Assert;
-import org.junit.Test;
 import org.mockito.Mockito;
-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.AnomalyDetectorJob;
 import org.opensearch.ad.model.DetectorProfile;
-import org.opensearch.ad.model.Feature;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
 import org.opensearch.core.common.io.stream.NamedWriteableRegistry;
@@ -33,6 +27,12 @@
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.InternalSettingsPlugin;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.model.Job;
+
+import com.google.common.collect.ImmutableList;

 /**
  * Need to extend from OpenSearchSingleNodeTestCase and override getPlugins and writeableRegistry
@@ -43,7 +43,7 @@ public class GetAnomalyDetectorActionTests extends OpenSearchSingleNodeTestCase {
     @Override
     protected Collection> getPlugins() {
-        return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class);
+        return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class);
     }

     @Override
@@ -51,7 +51,6 @@ protected NamedWriteableRegistry writableRegistry() {
         return getInstanceFromNode(NamedWriteableRegistry.class);
     }

-    @Test
     public void testGetRequest() throws IOException {
         BytesStreamOutput out = new BytesStreamOutput();
         GetAnomalyDetectorRequest request = new GetAnomalyDetectorRequest("1234", 4321, false, false, "nonempty", "", false, null);
@@ -62,12 +61,12 @@

     }

-    @Test
     public void testGetResponse() throws Exception {
         BytesStreamOutput out = new BytesStreamOutput();
         Feature feature = TestHelpers.randomFeature(true);
-        AnomalyDetector detector = TestHelpers.randomAnomalyDetector(List.of(feature));
-        AnomalyDetectorJob detectorJob = Mockito.mock(AnomalyDetectorJob.class);
+        AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableList.of(feature)); // Mockito.mock(AnomalyDetector.class);
+        Job detectorJob = Mockito.mock(Job.class);
+        // Mockito.doNothing().when(detector).writeTo(out);
         GetAnomalyDetectorResponse response = new GetAnomalyDetectorResponse(
             1234,
             "4567",
@@ -85,8 +84,9 @@
             false
         );
         response.writeTo(out);
-
+        // StreamInput input = out.bytes().streamInput();
         NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(out.bytes().streamInput(), writableRegistry());
+        // PowerMockito.whenNew(AnomalyDetector.class).withAnyArguments().thenReturn(detector);
         GetAnomalyDetectorResponse newResponse = new GetAnomalyDetectorResponse(input);
         Assert.assertNotNull(newResponse);
     }
diff --git a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorResponseTests.java b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorResponseTests.java
index 9a43fb40b..236cd2b58 100644
--- a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorResponseTests.java
+++ b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorResponseTests.java
@@ -16,8 +16,6 @@
 import java.time.temporal.ChronoUnit;
 import java.util.Collection;

-import org.opensearch.ad.AnomalyDetectorPlugin;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.action.ActionResponse;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
@@ -28,6 +26,8 @@
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.InternalSettingsPlugin;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;

@@ -36,7 +36,7 @@ public class GetAnomalyDetectorResponseTests extends OpenSearchSingleNodeTestCas
     @Override
     protected Collection> getPlugins() {
-        return pluginList(InternalSettingsPlugin.class, AnomalyDetectorPlugin.class);
+        return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class);
     }

     @Override
diff --git a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTests.java b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTests.java
index 6b8ee81cc..ac5702edc 100644
--- a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTests.java
+++ b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTests.java
@@ -20,12 +20,9 @@
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX;
-import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX;

 import java.io.IOException;
 import java.nio.ByteBuffer;
-import java.time.Clock;
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashSet;
@@ -41,17 +38,10 @@
 import org.opensearch.action.get.MultiGetResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.constant.CommonErrorMessages;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.ad.model.ADTaskType;
-import org.opensearch.ad.model.Entity;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.DiscoveryNodeFilterer;
-import org.opensearch.ad.util.SecurityClientUtil;
-import org.opensearch.ad.util.Throttler;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
@@ -60,10 +50,17 @@
 import org.opensearch.core.common.bytes.BytesReference;
 import org.opensearch.index.get.GetResult;
 import org.opensearch.telemetry.tracing.noop.NoopTracer;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.model.Entity;
+import org.opensearch.timeseries.util.DiscoveryNodeFilterer;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportService;

-public class GetAnomalyDetectorTests extends AbstractADTest {
+public class GetAnomalyDetectorTests extends AbstractTimeSeriesTest {
     private GetAnomalyDetectorTransportAction action;
     private TransportService transportService;
     private DiscoveryNodeFilterer nodeFilter;
@@ -96,7 +93,7 @@ public void setUp() throws Exception {
         ClusterService clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
@@ -118,9 +115,6 @@ public void setUp() throws Exception {
         client = mock(Client.class);
         when(client.threadPool()).thenReturn(threadPool);

-        Clock clock = mock(Clock.class);
-        Throttler throttler = new Throttler(clock);
-
         NodeStateManager nodeStateManager = mock(NodeStateManager.class);
         clientUtil = new SecurityClientUtil(nodeStateManager, Settings.EMPTY);
@@ -150,7 +144,7 @@ public void testInvalidRequest() throws IOException {
         future = new PlainActionFuture<>();

         action.doExecute(null, request, future);
-        assertException(future, OpenSearchStatusException.class, CommonErrorMessages.EMPTY_PROFILES_COLLECT);
+        assertException(future, OpenSearchStatusException.class, CommonMessages.EMPTY_PROFILES_COLLECT);
     }

     @SuppressWarnings("unchecked")
@@ -161,7 +155,7 @@ public void testValidRequest() throws IOException {
             ActionListener listener = (ActionListener) args[1];

             String indexName = request.index();
-            if (indexName.equals(ANOMALY_DETECTORS_INDEX)) {
+            if (indexName.equals(CommonName.CONFIG_INDEX)) {
                 listener.onResponse(null);
             }
             return null;
@@ -175,7 +169,7 @@ public void testValidRequest() throws IOException {
         future = new PlainActionFuture<>();

         action.doExecute(null, request, future);
-        assertException(future, OpenSearchStatusException.class, CommonErrorMessages.FAIL_TO_FIND_DETECTOR_MSG);
+        assertException(future, OpenSearchStatusException.class, CommonMessages.FAIL_TO_FIND_CONFIG_MSG);
     }

     public void testGetTransportActionWithReturnTask() {
@@ -221,13 +215,13 @@ private MultiGetResponse createMultiGetResponse() {
         ByteBuffer[] buffers = new ByteBuffer[0];
         items[0] = new MultiGetItemResponse(
             new GetResponse(
-                new GetResult(ANOMALY_DETECTOR_JOB_INDEX, "test_1", 1, 1, 0, true, BytesReference.fromByteBuffers(buffers), null, null)
+                new GetResult(CommonName.JOB_INDEX, "test_1", 1, 1, 0, true, BytesReference.fromByteBuffers(buffers), null, null)
             ),
             null
         );
         items[1] = new
MultiGetItemResponse( new GetResponse( - new GetResult(ANOMALY_DETECTOR_JOB_INDEX, "test_2", 1, 1, 0, true, BytesReference.fromByteBuffers(buffers), null, null) + new GetResult(CommonName.JOB_INDEX, "test_2", 1, 1, 0, true, BytesReference.fromByteBuffers(buffers), null, null) ), null ); diff --git a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportActionTests.java index 6e4cda44a..35f6ba36f 100644 --- a/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/GetAnomalyDetectorTransportActionTests.java @@ -25,13 +25,9 @@ import org.junit.*; import org.mockito.Mockito; import org.opensearch.action.support.ActionFilters; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.ADTask; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.EntityProfile; import org.opensearch.ad.model.InitProgressProfile; import org.opensearch.ad.settings.AnomalyDetectorSettings; @@ -52,6 +48,13 @@ import org.opensearch.test.OpenSearchSingleNodeTestCase; import org.opensearch.threadpool.TestThreadPool; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; +import org.opensearch.timeseries.util.RestHandlerUtils; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableMap; @@ -83,7 +86,7 @@ public void setUp() throws Exception { ClusterService clusterService = mock(ClusterService.class); ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, - Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES))) + Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES))) ); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); adTaskManager = mock(ADTaskManager.class); @@ -123,32 +126,14 @@ protected NamedWriteableRegistry writableRegistry() { @Test public void testGetTransportAction() throws IOException { - GetAnomalyDetectorRequest getAnomalyDetectorRequest = new GetAnomalyDetectorRequest( - "1234", - 4321, - false, - false, - "nonempty", - "", - false, - null - ); - action.doExecute(task, getAnomalyDetectorRequest, response); + GetAnomalyDetectorRequest getConfigRequest = new GetAnomalyDetectorRequest("1234", 4321, false, false, "nonempty", "", false, null); + action.doExecute(task, getConfigRequest, response); } @Test public void testGetTransportActionWithReturnJob() throws IOException { - GetAnomalyDetectorRequest getAnomalyDetectorRequest = new GetAnomalyDetectorRequest( - "1234", - 4321, - true, - false, - "", - "abcd", - false, - null - ); - action.doExecute(task, getAnomalyDetectorRequest, response); + GetAnomalyDetectorRequest getConfigRequest = new GetAnomalyDetectorRequest("1234", 4321, true, false, "", "abcd", false, null); + action.doExecute(task, getConfigRequest, response); } @Test @@ -184,7 
+169,7 @@ public void testGetAnomalyDetectorRequestNoEntityValue() throws IOException { public void testGetAnomalyDetectorResponse() throws IOException { BytesStreamOutput out = new BytesStreamOutput(); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); - AnomalyDetectorJob adJob = TestHelpers.randomAnomalyDetectorJob(); + Job adJob = TestHelpers.randomAnomalyDetectorJob(); GetAnomalyDetectorResponse response = new GetAnomalyDetectorResponse( 4321, "1234", @@ -218,7 +203,7 @@ public void testGetAnomalyDetectorResponse() throws IOException { public void testGetAnomalyDetectorProfileResponse() throws IOException { BytesStreamOutput out = new BytesStreamOutput(); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); - AnomalyDetectorJob adJob = TestHelpers.randomAnomalyDetectorJob(); + Job adJob = TestHelpers.randomAnomalyDetectorJob(); InitProgressProfile initProgress = new InitProgressProfile("99%", 2L, 2); EntityProfile entityProfile = new EntityProfile.Builder().initProgress(initProgress).build(); GetAnomalyDetectorResponse response = new GetAnomalyDetectorResponse( @@ -245,7 +230,7 @@ public void testGetAnomalyDetectorProfileResponse() throws IOException { // {init_progress={percentage=99%, estimated_minutes_left=2, needed_shingles=2}} Map map = TestHelpers.XContentBuilderToMap(builder); - Map parsedInitProgress = (Map) (map.get(CommonName.INIT_PROGRESS)); + Map parsedInitProgress = (Map) (map.get(ADCommonName.INIT_PROGRESS)); Assert.assertEquals(initProgress.getPercentage(), parsedInitProgress.get(InitProgressProfile.PERCENTAGE).toString()); assertTrue(initProgress.toString().contains("[percentage=99%,estimated_minutes_left=2,needed_shingles=2]")); Assert diff --git a/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorActionTests.java b/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorActionTests.java index 604976d09..f29030912 100644 --- a/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorActionTests.java @@ -18,9 +18,7 @@ import org.junit.Before; import org.junit.Test; import org.opensearch.action.support.WriteRequest; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.common.io.stream.BytesStreamOutput; import org.opensearch.common.unit.TimeValue; import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; @@ -30,6 +28,8 @@ import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.rest.RestRequest; import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.util.RestHandlerUtils; import com.google.common.collect.ImmutableMap; diff --git a/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportActionTests.java index f24617192..d370fa703 100644 --- a/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/IndexAnomalyDetectorTransportActionTests.java @@ -36,14 +36,10 @@ import org.opensearch.action.search.ShardSearchFailure; import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.WriteRequest; -import 
org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.feature.SearchFeatureDao;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.task.ADTaskManager;
-import org.opensearch.ad.util.SecurityClientUtil;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterName;
 import org.opensearch.cluster.ClusterState;
@@ -62,6 +58,11 @@
 import org.opensearch.tasks.Task;
 import org.opensearch.test.OpenSearchIntegTestCase;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonName;
+import org.opensearch.timeseries.feature.SearchFeatureDao;
+import org.opensearch.timeseries.util.SecurityClientUtil;
 import org.opensearch.transport.TransportService;

 import com.google.common.collect.ImmutableMap;
@@ -86,7 +87,7 @@ public void setUp() throws Exception {
         clusterService = mock(ClusterService.class);
         clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
@@ -98,9 +99,9 @@ public void setUp() throws Exception {
             .put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT)
             .build();
         final Settings.Builder existingSettings = Settings.builder().put(indexSettings).put(IndexMetadata.SETTING_INDEX_UUID, "test2UUID");
-        IndexMetadata indexMetaData = IndexMetadata.builder(AnomalyDetector.ANOMALY_DETECTORS_INDEX).settings(existingSettings).build();
+        IndexMetadata indexMetaData = IndexMetadata.builder(CommonName.CONFIG_INDEX).settings(existingSettings).build();
         final Map<String, IndexMetadata> indices = new HashMap<>();
-        indices.put(AnomalyDetector.ANOMALY_DETECTORS_INDEX, indexMetaData);
+        indices.put(CommonName.CONFIG_INDEX, indexMetaData);
         ClusterState clusterState = ClusterState.builder(clusterName).metadata(Metadata.builder().indices(indices).build()).build();
         when(clusterService.state()).thenReturn(clusterState);
@@ -115,15 +116,14 @@ public void setUp() throws Exception {
             clientUtil,
             clusterService,
             indexSettings(),
-            mock(AnomalyDetectionIndices.class),
+            mock(ADIndexManagement.class),
             xContentRegistry(),
             adTaskManager,
             searchFeatureDao
         );
         task = mock(Task.class);
         AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now());
-        GetResponse getDetectorResponse = TestHelpers
-            .createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        GetResponse getDetectorResponse = TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX);
         doAnswer(invocation -> {
             Object[] args = invocation.getArguments();
             assertTrue(
@@ -199,7 +199,7 @@ public void testIndexTransportAction() {

     @Test
     public void testIndexTransportActionWithUserAndFilterOn() {
-        Settings settings = Settings.builder().put(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.getKey(), true).build();
+        Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.getKey(), true).build();
         ThreadContext threadContext = new ThreadContext(settings);
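A note on the pattern this hunk keeps repeating (the hunk itself resumes below with the putTransient call): every test of backend-role filtering enables the renamed AD_FILTER_BY_BACKEND_ROLES setting, registers it with ClusterSettings so the mocked ClusterService can serve it, and injects a serialized user into the thread context. A minimal sketch of that setup assembled in one place; it uses only names visible in the diff, the helper class is illustrative rather than part of the commit, and the ConfigConstants import path is an assumption based on common-utils usage:

    import static org.mockito.Mockito.when;

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.HashSet;

    import org.opensearch.ad.settings.AnomalyDetectorSettings;
    import org.opensearch.cluster.service.ClusterService;
    import org.opensearch.common.settings.ClusterSettings;
    import org.opensearch.common.settings.Settings;
    import org.opensearch.common.util.concurrent.ThreadContext;
    import org.opensearch.commons.ConfigConstants; // assumed import path

    // Illustrative helper, not part of the commit.
    class FilterByBackendRolesSetup {
        static ThreadContext filterOnThreadContext(ClusterService clusterService) {
            // Enable backend-role filtering under its renamed AD_-prefixed key.
            Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.getKey(), true).build();

            // Register the setting so the mocked ClusterService can serve it;
            // unregistered settings fail at read time with "unknown setting".
            ClusterSettings clusterSettings = new ClusterSettings(
                Settings.EMPTY,
                Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
            );
            when(clusterService.getClusterSettings()).thenReturn(clusterSettings);

            // Simulate an authenticated caller: "name|backend roles|roles".
            ThreadContext threadContext = new ThreadContext(settings);
            threadContext.putTransient(ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT, "alice|odfe,aes|engineering,operations");
            return threadContext;
        }
    }

With filtering on, the transport action is expected to return a detector only when the caller shares a backend role with the detector's creator, which is what the surrounding tests exercise.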
threadContext.putTransient(ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT, "alice|odfe,aes|engineering,operations"); when(clusterService.getClusterSettings()).thenReturn(clusterSettings); @@ -214,7 +214,7 @@ public void testIndexTransportActionWithUserAndFilterOn() { clientUtil, clusterService, settings, - mock(AnomalyDetectionIndices.class), + mock(ADIndexManagement.class), xContentRegistry(), adTaskManager, searchFeatureDao @@ -240,7 +240,7 @@ public void testIndexTransportActionWithUserAndFilterOff() { clientUtil, clusterService, settings, - mock(AnomalyDetectionIndices.class), + mock(ADIndexManagement.class), xContentRegistry(), adTaskManager, searchFeatureDao diff --git a/src/test/java/org/opensearch/ad/transport/MultiEntityResultTests.java b/src/test/java/org/opensearch/ad/transport/MultiEntityResultTests.java index 7d7e4de47..94e07fe3c 100644 --- a/src/test/java/org/opensearch/ad/transport/MultiEntityResultTests.java +++ b/src/test/java/org/opensearch/ad/transport/MultiEntityResultTests.java @@ -24,10 +24,8 @@ import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_ENTITIES_PER_QUERY; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.PAGE_SIZE; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_MAX_ENTITIES_PER_QUERY; +import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_PAGE_SIZE; import java.io.IOException; import java.time.Clock; @@ -56,7 +54,6 @@ import org.mockito.ArgumentCaptor; import org.mockito.ArgumentMatcher; import org.mockito.stubbing.Answer; -import org.opensearch.BwcTests; import org.opensearch.OpenSearchTimeoutException; import org.opensearch.Version; import org.opensearch.action.get.GetRequest; @@ -69,25 +66,15 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.PlainActionFuture; import org.opensearch.action.support.master.AcknowledgedResponse; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.breaker.ADCircuitBreakerService; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; import org.opensearch.ad.cluster.HashRing; -import org.opensearch.ad.common.exception.EndRunException; -import org.opensearch.ad.common.exception.InternalFailure; -import org.opensearch.ad.common.exception.LimitExceededException; -import org.opensearch.ad.constant.CommonErrorMessages; import org.opensearch.ad.feature.CompositeRetriever; import org.opensearch.ad.feature.FeatureManager; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.ratelimit.CheckpointReadWorker; import org.opensearch.ad.ratelimit.ColdEntityWorker; import org.opensearch.ad.ratelimit.EntityColdStartWorker; @@ -96,13 +83,8 @@ import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.stats.ADStat; import 
org.opensearch.ad.stats.ADStats; -import org.opensearch.ad.stats.StatNames; import org.opensearch.ad.stats.suppliers.CounterSupplier; import org.opensearch.ad.task.ADTaskManager; -import org.opensearch.ad.util.Bwc; -import org.opensearch.ad.util.ClientUtil; -import org.opensearch.ad.util.SecurityClientUtil; -import org.opensearch.ad.util.Throttler; import org.opensearch.client.Client; import org.opensearch.cluster.metadata.IndexNameExpressionResolver; import org.opensearch.cluster.node.DiscoveryNode; @@ -126,6 +108,22 @@ import org.opensearch.test.ClusterServiceUtils; import org.opensearch.test.OpenSearchTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.common.exception.InternalFailure; +import org.opensearch.timeseries.common.exception.LimitExceededException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.stats.StatNames; +import org.opensearch.timeseries.util.ClientUtil; +import org.opensearch.timeseries.util.SecurityClientUtil; import org.opensearch.transport.Transport; import org.opensearch.transport.TransportException; import org.opensearch.transport.TransportInterceptor; @@ -139,7 +137,7 @@ import test.org.opensearch.ad.util.MLUtil; import test.org.opensearch.ad.util.RandomModelStateConfig; -public class MultiEntityResultTests extends AbstractADTest { +public class MultiEntityResultTests extends AbstractTimeSeriesTest { private AnomalyResultTransportAction action; private AnomalyResultRequest request; private TransportInterceptor entityResultInterceptor; @@ -155,13 +153,13 @@ public class MultiEntityResultTests extends AbstractADTest { private HashRing hashRing; private ClusterService clusterService; private IndexNameExpressionResolver indexNameResolver; - private ADCircuitBreakerService adCircuitBreakerService; + private CircuitBreakerService adCircuitBreakerService; private ADStats adStats; private ThreadPool mockThreadPool; private String detectorId; private Instant now; private CacheProvider provider; - private AnomalyDetectionIndices indexUtil; + private ADIndexManagement indexUtil; private ResultWriteWorker resultWriteQueue; private CheckpointReadWorker checkpointReadQueue; private EntityColdStartWorker entityColdStartQueue; @@ -179,7 +177,6 @@ public class MultiEntityResultTests extends AbstractADTest { @BeforeClass public static void setUpBeforeClass() { setUpThreadPool(AnomalyResultTests.class.getSimpleName()); - Bwc.DISABLE_BWC = false; } @AfterClass @@ -203,12 +200,12 @@ public void setUp() throws Exception { stateManager = mock(NodeStateManager.class); // make sure parameters are not null, otherwise this mock won't get invoked doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(stateManager).getAnomalyDetector(anyString(), any(ActionListener.class)); + 
}).when(stateManager).getConfig(anyString(), eq(AnalysisType.AD), any(ActionListener.class)); - settings = Settings.builder().put(AnomalyDetectorSettings.COOLDOWN_MINUTES.getKey(), TimeValue.timeValueMinutes(5)).build(); + settings = Settings.builder().put(AnomalyDetectorSettings.AD_COOLDOWN_MINUTES.getKey(), TimeValue.timeValueMinutes(5)).build(); // make sure end time is larger enough than Clock.systemUTC().millis() to get PageIterator.hasNext() to pass request = new AnomalyResultRequest(detectorId, 100, Clock.systemUTC().millis() + 100_000); @@ -230,10 +227,10 @@ public void setUp() throws Exception { hashRing = mock(HashRing.class); Set> anomalyResultSetting = new HashSet<>(ClusterSettings.BUILT_IN_CLUSTER_SETTINGS); - anomalyResultSetting.add(MAX_ENTITIES_PER_QUERY); - anomalyResultSetting.add(PAGE_SIZE); - anomalyResultSetting.add(MAX_RETRY_FOR_UNRESPONSIVE_NODE); - anomalyResultSetting.add(BACKOFF_MINUTES); + anomalyResultSetting.add(AD_MAX_ENTITIES_PER_QUERY); + anomalyResultSetting.add(AD_PAGE_SIZE); + anomalyResultSetting.add(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE); + anomalyResultSetting.add(TimeSeriesSettings.BACKOFF_MINUTES); ClusterSettings clusterSettings = new ClusterSettings(Settings.EMPTY, anomalyResultSetting); DiscoveryNode discoveryNode = new DiscoveryNode( @@ -248,7 +245,7 @@ public void setUp() throws Exception { indexNameResolver = new IndexNameExpressionResolver(new ThreadContext(Settings.EMPTY)); - adCircuitBreakerService = mock(ADCircuitBreakerService.class); + adCircuitBreakerService = mock(CircuitBreakerService.class); when(adCircuitBreakerService.isOpen()).thenReturn(false); Map> statsMap = new HashMap>() { @@ -302,7 +299,7 @@ public void setUp() throws Exception { .thenReturn(MLUtil.randomModelState(new RandomModelStateConfig.Builder().fullModel(true).build())); when(entityCache.selectUpdateCandidate(any(), any(), any())).thenReturn(Pair.of(new ArrayList(), new ArrayList())); - indexUtil = mock(AnomalyDetectionIndices.class); + indexUtil = mock(ADIndexManagement.class); resultWriteQueue = mock(ResultWriteWorker.class); checkpointReadQueue = mock(CheckpointReadWorker.class); entityColdStartQueue = mock(EntityColdStartWorker.class); @@ -336,7 +333,7 @@ public void testColdStartEndRunException() { .of( new EndRunException( detectorId, - CommonErrorMessages.INVALID_SEARCH_QUERY_MSG, + CommonMessages.INVALID_SEARCH_QUERY_MSG, new NoSuchElementException("No value present"), false ) @@ -344,7 +341,7 @@ public void testColdStartEndRunException() { ); PlainActionFuture listener = new PlainActionFuture<>(); action.doExecute(null, request, listener); - assertException(listener, EndRunException.class, CommonErrorMessages.INVALID_SEARCH_QUERY_MSG); + assertException(listener, EndRunException.class, CommonMessages.INVALID_SEARCH_QUERY_MSG); } // a handler that forwards response or exception received from network @@ -352,7 +349,6 @@ private TransportResponseHandler entityResultHa return new TransportResponseHandler() { @Override public T read(StreamInput in) throws IOException { - in.setVersion(BwcTests.V_1_1_0); return handler.read(in); } @@ -378,7 +374,6 @@ private TransportResponseHandler unackEntityRes return new TransportResponseHandler() { @Override public T read(StreamInput in) throws IOException { - in.setVersion(BwcTests.V_1_1_0); return handler.read(in); } @@ -436,7 +431,7 @@ public void setUpNormlaStateManager() throws IOException { .build(); doAnswer(invocation -> { ActionListener listener = invocation.getArgument(1); - 
listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, AnomalyDetector.ANOMALY_DETECTORS_INDEX));
+            listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, CommonName.CONFIG_INDEX));
             return null;
         }).when(client).get(any(GetRequest.class), any(ActionListener.class));
@@ -444,10 +439,12 @@ public void setUpNormlaStateManager() throws IOException {
             client,
             xContentRegistry(),
             settings,
-            new ClientUtil(settings, client, new Throttler(mock(Clock.class)), threadPool),
+            new ClientUtil(client),
             clock,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
-            clusterService
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
+            clusterService,
+            TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE,
+            TimeSeriesSettings.BACKOFF_MINUTES
         );

         clientUtil = new SecurityClientUtil(stateManager, settings);
@@ -511,7 +508,7 @@ public void testQueryErrorEndRunNotNow() throws InterruptedException, IOExceptio
         action.doExecute(null, request, listener2);
         Exception e = expectThrows(EndRunException.class, () -> listener2.actionGet(10000L));
         // wrapped INVALID_SEARCH_QUERY_MSG around SearchPhaseExecutionException by convertedQueryFailureException
-        assertThat("actual message: " + e.getMessage(), e.getMessage(), containsString(CommonErrorMessages.INVALID_SEARCH_QUERY_MSG));
+        assertThat("actual message: " + e.getMessage(), e.getMessage(), containsString(CommonMessages.INVALID_SEARCH_QUERY_MSG));
         assertThat("actual message: " + e.getMessage(), e.getMessage(), containsString(allShardsFailedMsg));
         // not end now
         assertTrue(!((EndRunException) e).isEndNow());
@@ -683,7 +680,7 @@ public void sendRequest(
         // we start support multi-category fields since 1.1
         // Set version to 1.1 will force the outbound/inbound message to use 1.1 version
-        setupTestNodes(entityResultInterceptor, 5, settings, BwcTests.V_1_1_0, MAX_ENTITIES_PER_QUERY, PAGE_SIZE);
+        setupTestNodes(entityResultInterceptor, 5, settings, Version.V_2_0_0, AD_MAX_ENTITIES_PER_QUERY, AD_PAGE_SIZE);
         TransportService realTransportService = testNodes[0].transportService;
         ClusterService realClusterService = testNodes[0].clusterService;
@@ -748,7 +745,7 @@ public void testCircuitBreakerOpen() throws InterruptedException, IOException {
         ClientUtil clientUtil = mock(ClientUtil.class);
         doAnswer(invocation -> {
             ActionListener<GetResponse> listener = invocation.getArgument(2);
-            listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, AnomalyDetector.ANOMALY_DETECTORS_INDEX));
+            listener.onResponse(TestHelpers.createGetResponse(detector, detectorId, CommonName.CONFIG_INDEX));
             return null;
         }).when(clientUtil).asyncRequest(any(GetRequest.class), any(), any(ActionListener.class));
@@ -758,8 +755,10 @@ public void testCircuitBreakerOpen() throws InterruptedException, IOException {
             settings,
             clientUtil,
             clock,
-            AnomalyDetectorSettings.HOURLY_MAINTENANCE,
-            clusterService
+            TimeSeriesSettings.HOURLY_MAINTENANCE,
+            clusterService,
+            TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE,
+            TimeSeriesSettings.BACKOFF_MINUTES
         );

         NodeStateManager spyStateManager = spy(stateManager);
@@ -770,7 +769,7 @@ public void testCircuitBreakerOpen() throws InterruptedException, IOException {
         when(hashRing.getOwningNodeWithSameLocalAdVersionForRealtimeAD(any(String.class)))
             .thenReturn(Optional.of(testNodes[1].discoveryNode()));

-        ADCircuitBreakerService openBreaker = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService openBreaker = mock(CircuitBreakerService.class);
         when(openBreaker.isOpen()).thenReturn(true);

         // register entity result action
@@ -810,7 +809,7 @@ public void testCircuitBreakerOpen() throws InterruptedException, IOException {
         listener = new PlainActionFuture<>();
         action.doExecute(null, request, listener);
-        assertException(listener, LimitExceededException.class, CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG);
+        assertException(listener, LimitExceededException.class, CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG);
     }

     public void testNotAck() throws InterruptedException, IOException {
@@ -1215,11 +1214,11 @@ private NodeStateManager setUpTestExceptionTestingInModelNode() throws IOExcepti
         CountDownLatch modelNodeInProgress = new CountDownLatch(1);
         // make sure parameters are not null, otherwise this mock won't get invoked
         doAnswer(invocation -> {
-            ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(1);
+            ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(2);
             listener.onResponse(Optional.of(detector));
             modelNodeInProgress.countDown();
             return null;
-        }).when(modelNodeStateManager).getAnomalyDetector(anyString(), any(ActionListener.class));
+        }).when(modelNodeStateManager).getConfig(anyString(), eq(AnalysisType.AD), any(ActionListener.class));
         return modelNodeStateManager;
     }
@@ -1233,7 +1232,7 @@ public void testEndRunNowInModelNode() throws InterruptedException, IOException
                 .of(
                     new EndRunException(
                         detectorId,
-                        CommonErrorMessages.INVALID_SEARCH_QUERY_MSG,
+                        CommonMessages.INVALID_SEARCH_QUERY_MSG,
                         new NoSuchElementException("No value present"),
                         true
                     )
@@ -1246,7 +1245,7 @@ public void testEndRunNowInModelNode() throws InterruptedException, IOException
                 .of(
                     new EndRunException(
                         detectorId,
-                        CommonErrorMessages.INVALID_SEARCH_QUERY_MSG,
+                        CommonMessages.INVALID_SEARCH_QUERY_MSG,
                         new NoSuchElementException("No value present"),
                         true
                     )
@@ -1277,7 +1276,7 @@ public void testEndRunNowFalseInModelNode() throws InterruptedException, IOExcep
                 .of(
                     new EndRunException(
                         detectorId,
-                        CommonErrorMessages.INVALID_SEARCH_QUERY_MSG,
+                        CommonMessages.INVALID_SEARCH_QUERY_MSG,
                         new NoSuchElementException("No value present"),
                         false
                     )
diff --git a/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorActionTests.java b/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorActionTests.java
index 24d1ea07c..73c67fe79 100644
--- a/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorActionTests.java
@@ -15,7 +15,6 @@
 import org.junit.Assert;
 import org.junit.Test;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.common.io.stream.BytesStreamOutput;
@@ -23,6 +22,7 @@
 import org.opensearch.core.common.io.stream.NamedWriteableRegistry;
 import org.opensearch.core.xcontent.ToXContent;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;

 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
@@ -47,7 +47,7 @@ public void testPreviewRequest() throws Exception {
         request.writeTo(out);
         NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(out.bytes().streamInput(), writableRegistry());
         PreviewAnomalyDetectorRequest newRequest = new PreviewAnomalyDetectorRequest(input);
-        Assert.assertEquals(request.getDetectorId(), newRequest.getDetectorId());
+        Assert.assertEquals(request.getId(), newRequest.getId());
         Assert.assertEquals(request.getStartTime(), newRequest.getStartTime());
         Assert.assertEquals(request.getEndTime(), newRequest.getEndTime());
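The testPreviewRequest hunk above (which continues below with an assertNotNull on the detector) shows the round-trip idiom all of these serialization tests share: write to a BytesStreamOutput, re-read through a registry-aware stream input, then compare accessors, now via the renamed getId(). A condensed sketch of the idiom, assuming it lives in a test class extending OpenSearchSingleNodeTestCase (which supplies writableRegistry()); the method name is illustrative:

    import java.io.IOException;
    import java.time.Instant;

    import org.junit.Assert;
    import org.opensearch.ad.model.AnomalyDetector;
    import org.opensearch.common.io.stream.BytesStreamOutput;
    import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
    import org.opensearch.timeseries.TestHelpers;

    import com.google.common.collect.ImmutableMap;

    public void testPreviewRequestRoundTrip() throws IOException {
        AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now());
        PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(detector, detector.getId(), Instant.now(), Instant.now());

        // Serialize, then deserialize through a registry-aware stream so named
        // writeables inside the detector (features, queries) can be resolved.
        BytesStreamOutput out = new BytesStreamOutput();
        request.writeTo(out);
        NamedWriteableAwareStreamInput input = new NamedWriteableAwareStreamInput(out.bytes().streamInput(), writableRegistry());
        PreviewAnomalyDetectorRequest copy = new PreviewAnomalyDetectorRequest(input);

        // Field-by-field comparison; getId() is the accessor that replaces getDetectorId().
        Assert.assertEquals(request.getId(), copy.getId());
        Assert.assertEquals(request.getStartTime(), copy.getStartTime());
        Assert.assertEquals(request.getEndTime(), copy.getEndTime());
    }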
Assert.assertNotNull(newRequest.getDetector()); diff --git a/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportActionTests.java index 5a0abbd62..38cdce966 100644 --- a/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/PreviewAnomalyDetectorTransportActionTests.java @@ -45,17 +45,13 @@ import org.opensearch.action.support.ActionFilters; import org.opensearch.action.support.WriteRequest; import org.opensearch.ad.AnomalyDetectorRunner; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.breaker.ADCircuitBreakerService; -import org.opensearch.ad.constant.CommonErrorMessages; import org.opensearch.ad.feature.FeatureManager; import org.opensearch.ad.feature.Features; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.RestHandlerUtils; import org.opensearch.client.Client; import org.opensearch.cluster.ClusterName; import org.opensearch.cluster.ClusterState; @@ -74,6 +70,11 @@ import org.opensearch.tasks.Task; import org.opensearch.test.OpenSearchSingleNodeTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.breaker.CircuitBreakerService; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.util.RestHandlerUtils; import org.opensearch.transport.TransportService; import com.google.common.collect.ImmutableMap; @@ -86,7 +87,7 @@ public class PreviewAnomalyDetectorTransportActionTests extends OpenSearchSingle private FeatureManager featureManager; private ModelManager modelManager; private Task task; - private ADCircuitBreakerService circuitBreaker; + private CircuitBreakerService circuitBreaker; @Override @Before @@ -102,8 +103,8 @@ public void setUp() throws Exception { Arrays .asList( AnomalyDetectorSettings.MAX_ANOMALY_FEATURES, - AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES, - AnomalyDetectorSettings.PAGE_SIZE, + AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES, + AnomalyDetectorSettings.AD_PAGE_SIZE, AnomalyDetectorSettings.MAX_CONCURRENT_PREVIEW ) ) @@ -119,16 +120,16 @@ public void setUp() throws Exception { .put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT) .build(); final Settings.Builder existingSettings = Settings.builder().put(indexSettings).put(IndexMetadata.SETTING_INDEX_UUID, "test2UUID"); - IndexMetadata indexMetaData = IndexMetadata.builder(AnomalyDetector.ANOMALY_DETECTORS_INDEX).settings(existingSettings).build(); + IndexMetadata indexMetaData = IndexMetadata.builder(CommonName.CONFIG_INDEX).settings(existingSettings).build(); final Map indices = new HashMap<>(); - indices.put(AnomalyDetector.ANOMALY_DETECTORS_INDEX, indexMetaData); + indices.put(CommonName.CONFIG_INDEX, indexMetaData); ClusterState clusterState = ClusterState.builder(clusterName).metadata(Metadata.builder().indices(indices).build()).build(); when(clusterService.state()).thenReturn(clusterState); featureManager = mock(FeatureManager.class); modelManager = mock(ModelManager.class); runner = new AnomalyDetectorRunner(modelManager, 
featureManager, AnomalyDetectorSettings.MAX_PREVIEW_RESULTS); - circuitBreaker = mock(ADCircuitBreakerService.class); + circuitBreaker = mock(CircuitBreakerService.class); when(circuitBreaker.isOpen()).thenReturn(false); action = new PreviewAnomalyDetectorTransportAction( Settings.EMPTY, @@ -147,12 +148,7 @@ public void setUp() throws Exception { public void testPreviewTransportAction() throws IOException, InterruptedException { final CountDownLatch inProgressLatch = new CountDownLatch(1); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); - PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest( - detector, - detector.getDetectorId(), - Instant.now(), - Instant.now() - ); + PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(detector, detector.getId(), Instant.now(), Instant.now()); ActionListener previewResponse = new ActionListener() { @Override public void onResponse(PreviewAnomalyDetectorResponse response) { @@ -194,12 +190,7 @@ public void testPreviewTransportActionWithNoFeature() throws IOException, Interr // Detector with no feature, Preview should fail final CountDownLatch inProgressLatch = new CountDownLatch(1); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(Collections.emptyList()); - PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest( - detector, - detector.getDetectorId(), - Instant.now(), - Instant.now() - ); + PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(detector, detector.getId(), Instant.now(), Instant.now()); ActionListener previewResponse = new ActionListener() { @Override public void onResponse(PreviewAnomalyDetectorResponse response) { @@ -264,7 +255,7 @@ public void testPreviewTransportActionWithIndex() throws IOException, Interrupte final CountDownLatch inProgressLatch = new CountDownLatch(1); PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(null, "1234", Instant.now(), Instant.now()); Settings indexSettings = Settings.builder().put("index.number_of_shards", 5).put("index.number_of_replicas", 1).build(); - CreateIndexRequest indexRequest = new CreateIndexRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX, indexSettings); + CreateIndexRequest indexRequest = new CreateIndexRequest(CommonName.CONFIG_INDEX, indexSettings); client().admin().indices().create(indexRequest).actionGet(); ActionListener previewResponse = new ActionListener() { @Override @@ -286,7 +277,7 @@ public void onFailure(Exception e) { @Test public void testPreviewTransportActionNoContext() throws IOException, InterruptedException { final CountDownLatch inProgressLatch = new CountDownLatch(1); - Settings settings = Settings.builder().put(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES.getKey(), true).build(); + Settings settings = Settings.builder().put(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES.getKey(), true).build(); Client client = mock(Client.class); ThreadContext threadContext = new ThreadContext(settings); threadContext.putTransient(ConfigConstants.OPENSEARCH_SECURITY_USER_INFO_THREAD_CONTEXT, "alice|odfe,aes|engineering,operations"); @@ -304,15 +295,9 @@ public void testPreviewTransportActionNoContext() throws IOException, Interrupte circuitBreaker ); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); - PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest( - detector, - detector.getDetectorId(), - 
Instant.now(), - Instant.now() - ); + PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(detector, detector.getId(), Instant.now(), Instant.now()); - GetResponse getDetectorResponse = TestHelpers - .createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX); + GetResponse getDetectorResponse = TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX); doAnswer(invocation -> { Object[] args = invocation.getArguments(); assertTrue( @@ -349,11 +334,11 @@ public void onFailure(Exception e) { public void testPreviewTransportActionWithDetector() throws IOException, InterruptedException { final CountDownLatch inProgressLatch = new CountDownLatch(1); CreateIndexResponse createResponse = TestHelpers - .createIndex(client().admin(), AnomalyDetector.ANOMALY_DETECTORS_INDEX, AnomalyDetectionIndices.getAnomalyDetectorMappings()); + .createIndex(client().admin(), CommonName.CONFIG_INDEX, ADIndexManagement.getConfigMappings()); Assert.assertNotNull(createResponse); AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); - IndexRequest indexRequest = new IndexRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX) + IndexRequest indexRequest = new IndexRequest(CommonName.CONFIG_INDEX) .setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE) .source(detector.toXContent(XContentFactory.jsonBuilder(), RestHandlerUtils.XCONTENT_WITH_TYPE)); IndexResponse indexResponse = client().index(indexRequest).actionGet(5_000); @@ -404,12 +389,7 @@ public void onFailure(Exception e) { public void testCircuitBreakerOpen() throws IOException, InterruptedException { // preview has no detector id AnomalyDetector detector = TestHelpers.randomAnomalyDetectorUsingCategoryFields(null, Arrays.asList("a")); - PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest( - detector, - detector.getDetectorId(), - Instant.now(), - Instant.now() - ); + PreviewAnomalyDetectorRequest request = new PreviewAnomalyDetectorRequest(detector, detector.getId(), Instant.now(), Instant.now()); when(circuitBreaker.isOpen()).thenReturn(true); @@ -423,7 +403,7 @@ public void onResponse(PreviewAnomalyDetectorResponse response) { @Override public void onFailure(Exception e) { Assert.assertTrue("actual class: " + e.getClass(), e instanceof OpenSearchStatusException); - Assert.assertTrue(e.getMessage().contains(CommonErrorMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG)); + Assert.assertTrue(e.getMessage().contains(CommonMessages.MEMORY_CIRCUIT_BROKEN_ERR_MSG)); inProgressLatch.countDown(); } }; diff --git a/src/test/java/org/opensearch/ad/transport/ProfileITTests.java b/src/test/java/org/opensearch/ad/transport/ProfileITTests.java index e9aac6377..013f00097 100644 --- a/src/test/java/org/opensearch/ad/transport/ProfileITTests.java +++ b/src/test/java/org/opensearch/ad/transport/ProfileITTests.java @@ -16,20 +16,20 @@ import java.util.HashSet; import java.util.concurrent.ExecutionException; -import org.opensearch.ad.AnomalyDetectorPlugin; import org.opensearch.ad.model.DetectorProfileName; import org.opensearch.plugins.Plugin; import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; public class ProfileITTests extends OpenSearchIntegTestCase { @Override protected Collection> nodePlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } protected Collection> 
transportClientPlugins() { - return Collections.singletonList(AnomalyDetectorPlugin.class); + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); } public void testNormalProfile() throws ExecutionException, InterruptedException { diff --git a/src/test/java/org/opensearch/ad/transport/ProfileTests.java b/src/test/java/org/opensearch/ad/transport/ProfileTests.java index 0ece203f6..7df0d5e02 100644 --- a/src/test/java/org/opensearch/ad/transport/ProfileTests.java +++ b/src/test/java/org/opensearch/ad/transport/ProfileTests.java @@ -30,7 +30,7 @@ import org.opensearch.Version; import org.opensearch.action.FailedNodeException; import org.opensearch.ad.common.exception.JsonPathNotFoundException; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.DetectorProfileName; import org.opensearch.ad.model.ModelProfileOnNode; import org.opensearch.cluster.ClusterName; @@ -41,6 +41,7 @@ import org.opensearch.core.xcontent.ToXContent; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.constant.CommonName; import com.google.gson.JsonArray; import com.google.gson.JsonElement; @@ -115,7 +116,7 @@ public void testProfileNodeRequest() throws IOException { profilesToRetrieve.add(DetectorProfileName.COORDINATING_NODE); ProfileRequest ProfileRequest = new ProfileRequest(detectorId, profilesToRetrieve, false); ProfileNodeRequest ProfileNodeRequest = new ProfileNodeRequest(ProfileRequest); - assertEquals("ProfileNodeRequest has the wrong detector id", ProfileNodeRequest.getDetectorId(), detectorId); + assertEquals("ProfileNodeRequest has the wrong detector id", ProfileNodeRequest.getId(), detectorId); assertEquals("ProfileNodeRequest has the wrong ProfileRequest", ProfileNodeRequest.getProfilesToBeRetrieved(), profilesToRetrieve); // Test serialization @@ -123,7 +124,7 @@ public void testProfileNodeRequest() throws IOException { ProfileNodeRequest.writeTo(output); StreamInput streamInput = output.bytes().streamInput(); ProfileNodeRequest nodeRequest = new ProfileNodeRequest(streamInput); - assertEquals("serialization has the wrong detector id", nodeRequest.getDetectorId(), detectorId); + assertEquals("serialization has the wrong detector id", nodeRequest.getId(), detectorId); assertEquals("serialization has the wrong ProfileRequest", nodeRequest.getProfilesToBeRetrieved(), profilesToRetrieve); } @@ -161,7 +162,7 @@ public void testProfileNodeResponse() throws IOException, JsonPathNotFoundExcept ); } - assertEquals("toXContent has the wrong shingle size", JsonDeserializer.getIntValue(json, CommonName.SHINGLE_SIZE), shingleSize); + assertEquals("toXContent has the wrong shingle size", JsonDeserializer.getIntValue(json, ADCommonName.SHINGLE_SIZE), shingleSize); } @Test @@ -181,7 +182,7 @@ public void testProfileRequest() throws IOException { readRequest.getProfilesToBeRetrieved(), profileRequest.getProfilesToBeRetrieved() ); - assertEquals("Serialization has the wrong detector id", readRequest.getDetectorId(), profileRequest.getDetectorId()); + assertEquals("Serialization has the wrong detector id", readRequest.getId(), profileRequest.getId()); } @Test @@ -249,8 +250,8 @@ public void testProfileResponse() throws IOException, JsonPathNotFoundException JsonElement element = modelsJson.get(i); assertTrue( "toXContent has the wrong model id", - JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_KEY).equals(model1Id) - || 
JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_KEY).equals(model0Id) + JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_FIELD).equals(model1Id) + || JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_FIELD).equals(model0Id) ); assertEquals( @@ -259,7 +260,7 @@ public void testProfileResponse() throws IOException, JsonPathNotFoundException modelSize ); - if (JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_KEY).equals(model1Id)) { + if (JsonDeserializer.getTextValue(element, CommonName.MODEL_ID_FIELD).equals(model1Id)) { assertEquals("toXContent has the wrong node id", JsonDeserializer.getTextValue(element, ModelProfileOnNode.NODE_ID), node1); } else { assertEquals("toXContent has the wrong node id", JsonDeserializer.getTextValue(element, ModelProfileOnNode.NODE_ID), node2); diff --git a/src/test/java/org/opensearch/ad/transport/ProfileTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ProfileTransportActionTests.java index f0788dc5e..bccd385bb 100644 --- a/src/test/java/org/opensearch/ad/transport/ProfileTransportActionTests.java +++ b/src/test/java/org/opensearch/ad/transport/ProfileTransportActionTests.java @@ -29,19 +29,19 @@ import org.junit.Test; import org.opensearch.action.FailedNodeException; import org.opensearch.action.support.ActionFilters; -import org.opensearch.ad.AnomalyDetectorPlugin; import org.opensearch.ad.caching.CacheProvider; import org.opensearch.ad.caching.EntityCache; import org.opensearch.ad.feature.FeatureManager; import org.opensearch.ad.ml.ModelManager; import org.opensearch.ad.model.DetectorProfileName; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.ModelProfile; import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.common.settings.Settings; import org.opensearch.plugins.Plugin; import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.model.Entity; import org.opensearch.transport.TransportService; public class ProfileTransportActionTests extends OpenSearchIntegTestCase { @@ -114,13 +114,13 @@ public void setUp() throws Exception { } private void setUpModelSize(int maxModel) { - Settings nodeSettings = Settings.builder().put(AnomalyDetectorSettings.MAX_MODEL_SIZE_PER_NODE.getKey(), maxModel).build(); + Settings nodeSettings = Settings.builder().put(AnomalyDetectorSettings.AD_MAX_MODEL_SIZE_PER_NODE.getKey(), maxModel).build(); internalCluster().startNode(nodeSettings); } @Override protected Collection> nodePlugins() { - return Arrays.asList(AnomalyDetectorPlugin.class); + return Arrays.asList(TimeSeriesAnalyticsPlugin.class); } @Test @@ -145,7 +145,7 @@ public void testNewNodeRequest() { ProfileNodeRequest profileNodeRequest1 = new ProfileNodeRequest(profileRequest); ProfileNodeRequest profileNodeRequest2 = action.newNodeRequest(profileRequest); - assertEquals(profileNodeRequest1.getDetectorId(), profileNodeRequest2.getDetectorId()); + assertEquals(profileNodeRequest1.getId(), profileNodeRequest2.getId()); assertEquals(profileNodeRequest2.getProfilesToBeRetrieved(), profileNodeRequest2.getProfilesToBeRetrieved()); } diff --git a/src/test/java/org/opensearch/ad/transport/RCFPollingTests.java b/src/test/java/org/opensearch/ad/transport/RCFPollingTests.java index df1cc296e..7a91fc5f9 100644 --- a/src/test/java/org/opensearch/ad/transport/RCFPollingTests.java +++ 
b/src/test/java/org/opensearch/ad/transport/RCFPollingTests.java
@@ -28,14 +28,10 @@
 import org.opensearch.Version;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.cluster.HashRing;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.ml.ModelManager;
-import org.opensearch.ad.ml.SingleStreamModelIdMapper;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.Settings;
@@ -46,6 +42,10 @@
 import org.opensearch.core.xcontent.ToXContent;
 import org.opensearch.tasks.Task;
 import org.opensearch.telemetry.tracing.noop.NoopTracer;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.ml.SingleStreamModelIdMapper;
 import org.opensearch.transport.ConnectTransportException;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportException;
@@ -61,7 +61,7 @@
 import test.org.opensearch.ad.util.FakeNode;
 import test.org.opensearch.ad.util.JsonDeserializer;

-public class RCFPollingTests extends AbstractADTest {
+public class RCFPollingTests extends AbstractTimeSeriesTest {
     Gson gson = new GsonBuilder().create();
     private String detectorId = "jqIG6XIBEyaF3zCMZfcB";
     private String model0Id;
@@ -220,7 +220,7 @@ public void testNoNodeFoundForModel() {
             clusterService
         );
         action.doExecute(mock(Task.class), request, future);
-        assertException(future, AnomalyDetectionException.class, RCFPollingTransportAction.NO_NODE_FOUND_MSG);
+        assertException(future, TimeSeriesException.class, RCFPollingTransportAction.NO_NODE_FOUND_MSG);
     }

     /**
@@ -356,7 +356,7 @@ public void testResponseToXContent() throws IOException, JsonPathNotFoundExcepti
     public void testRequestToXContent() throws IOException, JsonPathNotFoundException {
         RCFPollingRequest response = new RCFPollingRequest(detectorId);
         String json = TestHelpers.xContentBuilderToString(response.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS));
-        assertEquals(detectorId, JsonDeserializer.getTextValue(json, CommonName.ID_JSON_KEY));
+        assertEquals(detectorId, JsonDeserializer.getTextValue(json, ADCommonName.ID_JSON_KEY));
     }

     public void testNullDetectorId() {
diff --git a/src/test/java/org/opensearch/ad/transport/RCFResultITTests.java b/src/test/java/org/opensearch/ad/transport/RCFResultITTests.java
index d521eb608..92037e3ee 100644
--- a/src/test/java/org/opensearch/ad/transport/RCFResultITTests.java
+++ b/src/test/java/org/opensearch/ad/transport/RCFResultITTests.java
@@ -16,20 +16,20 @@
 import java.util.concurrent.ExecutionException;

 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.AnomalyDetectorPlugin;
 import org.opensearch.common.action.ActionFuture;
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.OpenSearchIntegTestCase;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

 public class RCFResultITTests extends OpenSearchIntegTestCase {

     @Override
     protected Collection<Class<? extends Plugin>> nodePlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     protected Collection<Class<? extends Plugin>> transportClientPlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     public void testEmptyFeature() throws ExecutionException, InterruptedException {
diff --git a/src/test/java/org/opensearch/ad/transport/RCFResultTests.java b/src/test/java/org/opensearch/ad/transport/RCFResultTests.java
index ba6926a75..fad6c9ab0 100644
--- a/src/test/java/org/opensearch/ad/transport/RCFResultTests.java
+++ b/src/test/java/org/opensearch/ad/transport/RCFResultTests.java
@@ -37,17 +37,14 @@
 import org.opensearch.action.ActionRequestValidationException;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.breaker.ADCircuitBreakerService;
 import org.opensearch.ad.cluster.HashRing;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.common.exception.LimitExceededException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.ml.ThresholdingResult;
 import org.opensearch.ad.stats.ADStat;
 import org.opensearch.ad.stats.ADStats;
-import org.opensearch.ad.stats.StatNames;
 import org.opensearch.ad.stats.suppliers.CounterSupplier;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.common.io.stream.BytesStreamOutput;
@@ -59,6 +56,9 @@
 import org.opensearch.tasks.Task;
 import org.opensearch.telemetry.tracing.noop.NoopTracer;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.breaker.CircuitBreakerService;
+import org.opensearch.timeseries.common.exception.LimitExceededException;
+import org.opensearch.timeseries.stats.StatNames;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportService;

@@ -112,7 +112,7 @@ public void testNormal() {
         );

         ModelManager manager = mock(ModelManager.class);
-        ADCircuitBreakerService adCircuitBreakerService = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService adCircuitBreakerService = mock(CircuitBreakerService.class);
         RCFResultTransportAction action = new RCFResultTransportAction(
             mock(ActionFilters.class),
             transportService,
@@ -171,7 +171,7 @@ public void testExecutionException() {
         );

         ModelManager manager = mock(ModelManager.class);
-        ADCircuitBreakerService adCircuitBreakerService = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService adCircuitBreakerService = mock(CircuitBreakerService.class);
         RCFResultTransportAction action = new RCFResultTransportAction(
             mock(ActionFilters.class),
             transportService,
@@ -245,7 +245,7 @@ public void testJsonResponse() throws IOException, JsonPathNotFoundException {

     public void testEmptyID() {
         ActionRequestValidationException e = new RCFResultRequest(null, "123-rcf-1", new double[] { 0 }).validate();
-        assertThat(e.validationErrors(), Matchers.hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), Matchers.hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testFeatureIsNull() {
@@ -270,8 +270,8 @@ public void testJsonRequest() throws IOException, JsonPathNotFoundException {
         request.toXContent(builder, ToXContent.EMPTY_PARAMS);

         String json = builder.toString();
-        assertEquals(JsonDeserializer.getTextValue(json, CommonName.ID_JSON_KEY), request.getAdID());
-        assertArrayEquals(JsonDeserializer.getDoubleArrayValue(json, CommonName.FEATURE_JSON_KEY), request.getFeatures(), 0.001);
+        assertEquals(JsonDeserializer.getTextValue(json, ADCommonName.ID_JSON_KEY), request.getAdID());
+        assertArrayEquals(JsonDeserializer.getDoubleArrayValue(json, ADCommonName.FEATURE_JSON_KEY), request.getFeatures(), 0.001);
     }

     @SuppressWarnings("unchecked")
@@ -288,7 +288,7 @@ public void testCircuitBreaker() {
         );

         ModelManager manager = mock(ModelManager.class);
-        ADCircuitBreakerService breakerService = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService breakerService = mock(CircuitBreakerService.class);
         RCFResultTransportAction action = new RCFResultTransportAction(
             mock(ActionFilters.class),
             transportService,
@@ -340,7 +340,7 @@ public void testCorruptModel() {
         );

         ModelManager manager = mock(ModelManager.class);
-        ADCircuitBreakerService adCircuitBreakerService = mock(ADCircuitBreakerService.class);
+        CircuitBreakerService adCircuitBreakerService = mock(CircuitBreakerService.class);
         RCFResultTransportAction action = new RCFResultTransportAction(
             mock(ActionFilters.class),
             transportService,
diff --git a/src/test/java/org/opensearch/ad/transport/SearchADTasksActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchADTasksActionTests.java
index 542b83902..0d7ea4d9d 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchADTasksActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchADTasksActionTests.java
@@ -11,15 +11,15 @@

 package org.opensearch.ad.transport;

-import static org.opensearch.ad.TestHelpers.matchAllRequest;
+import static org.opensearch.timeseries.TestHelpers.matchAllRequest;

 import java.io.IOException;

 import org.junit.Test;
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.timeseries.TestHelpers;

 public class SearchADTasksActionTests extends HistoricalAnalysisIntegTestCase {

@@ -35,7 +35,7 @@ public void testSearchADTasksAction() throws IOException {

     @Test
     public void testNoIndex() {
-        deleteIndexIfExists(CommonName.DETECTION_STATE_INDEX);
+        deleteIndexIfExists(ADCommonName.DETECTION_STATE_INDEX);
         SearchResponse searchResponse = client().execute(SearchADTasksAction.INSTANCE, matchAllRequest()).actionGet(10000);
         assertEquals(0, searchResponse.getInternalResponse().hits().getTotalHits().value);
     }
diff --git a/src/test/java/org/opensearch/ad/transport/SearchADTasksTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchADTasksTransportActionTests.java
index 27edd6d63..bc87faf13 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchADTasksTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchADTasksTransportActionTests.java
@@ -23,7 +23,7 @@
 import org.opensearch.action.search.SearchRequest;
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.ADTask;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.index.IndexNotFoundException;
@@ -83,7 +83,7 @@ private SearchRequest searchRequest(boolean isLatest) {
         BoolQueryBuilder query = new BoolQueryBuilder();
         query.filter(new TermQueryBuilder(ADTask.IS_LATEST_FIELD, isLatest));
         sourceBuilder.query(query);
-        SearchRequest request = new SearchRequest().source(sourceBuilder).indices(CommonName.DETECTION_STATE_INDEX);
+        SearchRequest request = new SearchRequest().source(sourceBuilder).indices(ADCommonName.DETECTION_STATE_INDEX);
         return request;
     }

diff --git a/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorActionTests.java
index 7120c3a05..96099af31 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorActionTests.java
@@ -20,13 +20,14 @@
 import org.opensearch.action.search.SearchRequest;
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.ad.model.AnomalyDetectorType;
 import org.opensearch.index.query.BoolQueryBuilder;
 import org.opensearch.index.query.MatchAllQueryBuilder;
 import org.opensearch.index.query.TermQueryBuilder;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonName;

 import com.google.common.collect.ImmutableList;

@@ -61,7 +62,7 @@ public void testSearchDetectorAction() throws IOException {
     }

     public void testNoIndex() {
-        deleteIndexIfExists(AnomalyDetector.ANOMALY_DETECTORS_INDEX);
+        deleteIndexIfExists(CommonName.CONFIG_INDEX);

         BoolQueryBuilder query = new BoolQueryBuilder().filter(new MatchAllQueryBuilder());
         SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder().query(query);
diff --git a/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoActionTests.java
index 4ac6ded6f..f06761bb6 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchAnomalyDetectorInfoActionTests.java
@@ -16,7 +16,7 @@
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.TestHelpers.createEmptySearchResponse;
+import static org.opensearch.timeseries.TestHelpers.createEmptySearchResponse;

 import java.io.IOException;
 import java.util.Arrays;
@@ -29,7 +29,6 @@
 import org.opensearch.action.search.SearchResponse;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
@@ -43,6 +42,7 @@
 import org.opensearch.tasks.Task;
 import org.opensearch.test.OpenSearchIntegTestCase;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.TestHelpers;
 import org.opensearch.transport.TransportService;

 public class SearchAnomalyDetectorInfoActionTests extends OpenSearchIntegTestCase {
@@ -92,7 +92,7 @@ public void onFailure(Exception e) {
         clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
     }
diff --git a/src/test/java/org/opensearch/ad/transport/SearchAnomalyResultActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchAnomalyResultActionTests.java
index 01914e81a..df7844ef1 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchAnomalyResultActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchAnomalyResultActionTests.java
@@ -16,10 +16,10 @@
 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.TestHelpers.createClusterState;
-import static org.opensearch.ad.TestHelpers.createSearchResponse;
-import static org.opensearch.ad.TestHelpers.matchAllRequest;
-import static org.opensearch.ad.indices.AnomalyDetectionIndices.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.ad.indices.ADIndexManagement.ALL_AD_RESULTS_INDEX_PATTERN;
+import static org.opensearch.timeseries.TestHelpers.createClusterState;
+import static org.opensearch.timeseries.TestHelpers.createSearchResponse;
+import static org.opensearch.timeseries.TestHelpers.matchAllRequest;

 import java.io.IOException;
 import java.nio.charset.StandardCharsets;
@@ -39,8 +39,7 @@
 import org.opensearch.action.support.IndicesOptions;
 import org.opensearch.action.support.PlainActionFuture;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.ad.transport.handler.ADSearchHandler;
 import org.opensearch.client.Client;
@@ -61,6 +60,7 @@
 import org.opensearch.tasks.Task;
 import org.opensearch.telemetry.tracing.noop.NoopTracer;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.TestHelpers;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportService;

@@ -89,7 +89,7 @@ public void setUp() throws Exception {
         clusterService = mock(ClusterService.class);
         ClusterSettings clusterSettings = new ClusterSettings(
             Settings.EMPTY,
-            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES)))
+            Collections.unmodifiableSet(new HashSet<>(Arrays.asList(AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES)))
         );
         when(clusterService.getClusterSettings()).thenReturn(clusterSettings);
         clusterState = createClusterState();
@@ -299,7 +299,7 @@ public void testSearchResultAction() throws IOException {

     @Test
     public void testNoIndex() {
-        deleteIndexIfExists(CommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        deleteIndexIfExists(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
         SearchResponse searchResponse = client()
             .execute(SearchAnomalyResultAction.INSTANCE, matchAllRequest().indices(ALL_AD_RESULTS_INDEX_PATTERN))
             .actionGet(10000);
diff --git a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultActionTests.java
index 7669b8c5b..1e214f209 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultActionTests.java
@@ -15,10 +15,10 @@

 import org.junit.Before;
 import org.opensearch.ad.HistoricalAnalysisIntegTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.test.OpenSearchIntegTestCase;
+import org.opensearch.timeseries.TestHelpers;

 import com.google.common.collect.ImmutableList;

diff --git a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequestTests.java b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequestTests.java
index b946ebe4e..d227a0392 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequestTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultRequestTests.java
@@ -14,11 +14,11 @@

 import org.junit.Assert;
 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.core.xcontent.XContentBuilder;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.TestHelpers;

 public class SearchTopAnomalyResultRequestTests extends OpenSearchTestCase {

@@ -38,7 +38,7 @@ public void testSerialization() throws IOException {
         originalRequest.writeTo(output);
         StreamInput input = output.bytes().streamInput();
         SearchTopAnomalyResultRequest parsedRequest = new SearchTopAnomalyResultRequest(input);
-        assertEquals(originalRequest.getDetectorId(), parsedRequest.getDetectorId());
+        assertEquals(originalRequest.getId(), parsedRequest.getId());
         assertEquals(originalRequest.getTaskId(), parsedRequest.getTaskId());
         assertEquals(originalRequest.getHistorical(), parsedRequest.getHistorical());
         assertEquals(originalRequest.getSize(), parsedRequest.getSize());
@@ -78,7 +78,7 @@ public void testParse() throws IOException {
         assertEquals(order, parsedRequest.getOrder());
         assertEquals(startTime.toEpochMilli(), parsedRequest.getStartTime().toEpochMilli());
         assertEquals(endTime.toEpochMilli(), parsedRequest.getEndTime().toEpochMilli());
-        assertEquals(detectorId, parsedRequest.getDetectorId());
+        assertEquals(detectorId, parsedRequest.getId());
         assertEquals(historical, parsedRequest.getHistorical());
     }
diff --git a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultResponseTests.java b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultResponseTests.java
index 26c7be7c2..a0a08087e 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultResponseTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultResponseTests.java
@@ -9,10 +9,10 @@
 import java.util.ArrayList;
 import java.util.Arrays;

-import org.opensearch.ad.TestHelpers;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.StreamInput;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.TestHelpers;

 public class SearchTopAnomalyResultResponseTests extends OpenSearchTestCase {

diff --git a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportActionTests.java
index d1bac4c2a..a7b785f69 100644
--- a/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/SearchTopAnomalyResultTransportActionTests.java
@@ -27,8 +27,7 @@
 import org.opensearch.action.search.ShardSearchFailure;
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.ad.ADIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.AnomalyResultBucket;
 import org.opensearch.ad.transport.handler.ADSearchHandler;
 import org.opensearch.client.Client;
@@ -39,6 +38,7 @@
 import org.opensearch.search.aggregations.bucket.composite.CompositeAggregation;
 import org.opensearch.search.aggregations.metrics.InternalMax;
 import org.opensearch.search.builder.SearchSourceBuilder;
+import org.opensearch.timeseries.TestHelpers;
 import org.opensearch.transport.TransportService;

 import com.google.common.collect.ImmutableList;
@@ -93,7 +93,7 @@ public void setUp() throws Exception {
     }

     public void testSearchOnNonExistingResultIndex() throws IOException {
-        deleteIndexIfExists(CommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        deleteIndexIfExists(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
         String testIndexName = randomAlphaOfLength(10).toLowerCase(Locale.ROOT);
         ImmutableList<String> categoryFields = ImmutableList.of("test-field-1", "test-field-2");
         String detectorId = createDetector(
diff --git a/src/test/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportActionTests.java
index 4f5054132..7c877c086 100644
--- a/src/test/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/StatsAnomalyDetectorTransportActionTests.java
@@ -16,9 +16,9 @@

 import org.junit.Before;
 import org.opensearch.ad.ADIntegTestCase;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.stats.InternalStatNames;
-import org.opensearch.ad.stats.StatNames;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.stats.StatNames;

 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.ImmutableMap;
diff --git a/src/test/java/org/opensearch/ad/transport/ThresholdResultITTests.java b/src/test/java/org/opensearch/ad/transport/ThresholdResultITTests.java
index 8ca0a2c51..59b765cbe 100644
--- a/src/test/java/org/opensearch/ad/transport/ThresholdResultITTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ThresholdResultITTests.java
@@ -16,20 +16,20 @@
 import java.util.concurrent.ExecutionException;

 import org.opensearch.action.ActionRequestValidationException;
-import org.opensearch.ad.AnomalyDetectorPlugin;
 import org.opensearch.common.action.ActionFuture;
 import org.opensearch.plugins.Plugin;
 import org.opensearch.test.OpenSearchIntegTestCase;
+import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin;

 public class ThresholdResultITTests extends OpenSearchIntegTestCase {

     @Override
     protected Collection<Class<? extends Plugin>> nodePlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     protected Collection<Class<? extends Plugin>> transportClientPlugins() {
-        return Collections.singletonList(AnomalyDetectorPlugin.class);
+        return Collections.singletonList(TimeSeriesAnalyticsPlugin.class);
     }

     public void testEmptyID() throws ExecutionException, InterruptedException {
diff --git a/src/test/java/org/opensearch/ad/transport/ThresholdResultTests.java b/src/test/java/org/opensearch/ad/transport/ThresholdResultTests.java
index 5b277a545..4457f0fb7 100644
--- a/src/test/java/org/opensearch/ad/transport/ThresholdResultTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ThresholdResultTests.java
@@ -27,8 +27,8 @@
 import org.opensearch.action.support.ActionFilters;
 import org.opensearch.action.support.PlainActionFuture;
 import org.opensearch.ad.common.exception.JsonPathNotFoundException;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.ml.ModelManager;
 import org.opensearch.ad.ml.ThresholdingResult;
 import org.opensearch.common.io.stream.BytesStreamOutput;
@@ -120,18 +120,18 @@ public void testJsonResponse() throws IOException, JsonPathNotFoundException {
         response.toXContent(builder, ToXContent.EMPTY_PARAMS);

         String json = builder.toString();
-        assertEquals(JsonDeserializer.getDoubleValue(json, CommonName.ANOMALY_GRADE_JSON_KEY), response.getAnomalyGrade(), 0.001);
-        assertEquals(JsonDeserializer.getDoubleValue(json, CommonName.CONFIDENCE_JSON_KEY), response.getConfidence(), 0.001);
+        assertEquals(JsonDeserializer.getDoubleValue(json, ADCommonName.ANOMALY_GRADE_JSON_KEY), response.getAnomalyGrade(), 0.001);
+        assertEquals(JsonDeserializer.getDoubleValue(json, ADCommonName.CONFIDENCE_JSON_KEY), response.getConfidence(), 0.001);
     }

     public void testEmptyDetectorID() {
         ActionRequestValidationException e = new ThresholdResultRequest(null, "123-threshold", 2).validate();
-        assertThat(e.validationErrors(), Matchers.hasItem(CommonErrorMessages.AD_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), Matchers.hasItem(ADCommonMessages.AD_ID_MISSING_MSG));
     }

     public void testEmptyModelID() {
         ActionRequestValidationException e = new ThresholdResultRequest("123", "", 2).validate();
-        assertThat(e.validationErrors(), Matchers.hasItem(CommonErrorMessages.MODEL_ID_MISSING_MSG));
+        assertThat(e.validationErrors(), Matchers.hasItem(ADCommonMessages.MODEL_ID_MISSING_MSG));
     }

     public void testSerialzationRequest() throws IOException {
@@ -151,7 +151,7 @@ public void testJsonRequest() throws IOException, JsonPathNotFoundException {
         request.toXContent(builder, ToXContent.EMPTY_PARAMS);

         String json = builder.toString();
-        assertEquals(JsonDeserializer.getTextValue(json, CommonName.ID_JSON_KEY), request.getAdID());
-        assertEquals(JsonDeserializer.getDoubleValue(json, CommonName.RCF_SCORE_JSON_KEY), request.getRCFScore(), 0.001);
+        assertEquals(JsonDeserializer.getTextValue(json, ADCommonName.ID_JSON_KEY), request.getAdID());
+        assertEquals(JsonDeserializer.getDoubleValue(json, ADCommonName.RCF_SCORE_JSON_KEY), request.getRCFScore(), 0.001);
     }
 }
diff --git a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorRequestTests.java b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorRequestTests.java
index 40cece87b..4a1fae9cb 100644
--- a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorRequestTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorRequestTests.java
@@ -15,13 +15,13 @@
 import java.time.Instant;

 import org.junit.Test;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.ad.model.AnomalyDetector;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput;
 import org.opensearch.core.common.io.stream.NamedWriteableRegistry;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
+import org.opensearch.timeseries.TestHelpers;

 import com.google.common.collect.ImmutableMap;

diff --git a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorResponseTests.java b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorResponseTests.java
index 6899b453d..510ed2683 100644
--- a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorResponseTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorResponseTests.java
@@ -16,14 +16,14 @@
 import java.util.Map;

 import org.junit.Test;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonErrorMessages;
 import org.opensearch.ad.model.DetectorValidationIssue;
 import org.opensearch.common.io.stream.BytesStreamOutput;
 import org.opensearch.core.common.io.stream.StreamInput;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonMessages;

-public class ValidateAnomalyDetectorResponseTests extends AbstractADTest {
+public class ValidateAnomalyDetectorResponseTests extends AbstractTimeSeriesTest {

     @Test
     public void testResponseSerialization() throws IOException {
@@ -84,7 +84,7 @@ public void testResponseToXContentWithIntervalRec() throws IOException {
         String validationResponse = TestHelpers.xContentBuilderToString(response.toXContent(TestHelpers.builder()));
         assertEquals(
             "{\"model\":{\"detection_interval\":{\"message\":\""
-                + CommonErrorMessages.DETECTOR_INTERVAL_REC
+                + CommonMessages.INTERVAL_REC
                 + intervalRec
                 + "\",\"suggested_value\":{\"period\":{\"interval\":5,\"unit\":\"Minutes\"}}}}}",
             validationResponse
diff --git a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportActionTests.java b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportActionTests.java
index 71fb84701..604fc2c46 100644
--- a/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportActionTests.java
+++ b/src/test/java/org/opensearch/ad/transport/ValidateAnomalyDetectorTransportActionTests.java
@@ -19,17 +19,18 @@

 import org.junit.Test;
 import org.opensearch.ad.ADIntegTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonErrorMessages;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.constant.ADCommonMessages;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.DetectorValidationIssueType;
-import org.opensearch.ad.model.Feature;
-import org.opensearch.ad.model.ValidationAspect;
-import org.opensearch.ad.settings.AnomalyDetectorSettings;
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.search.aggregations.AggregationBuilder;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.model.ValidationAspect;
+import org.opensearch.timeseries.model.ValidationIssueType;
+import org.opensearch.timeseries.settings.TimeSeriesSettings;

 import com.google.common.base.Charsets;
 import com.google.common.collect.ImmutableList;
@@ -68,9 +69,9 @@ public void testValidateAnomalyDetectorWithNoIndexFound() throws IOException {
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertEquals(DetectorValidationIssueType.INDICES, response.getIssue().getType());
+        assertEquals(ValidationIssueType.INDICES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.INDEX_NOT_FOUND));
+        assertTrue(response.getIssue().getMessage().contains(ADCommonMessages.INDEX_NOT_FOUND));
     }

     @Test
@@ -89,7 +90,7 @@ public void testValidateAnomalyDetectorWithDuplicateName() throws IOException {
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertEquals(DetectorValidationIssueType.NAME, response.getIssue().getType());
+        assertEquals(ValidationIssueType.NAME, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
     }

@@ -108,11 +109,11 @@ public void testValidateAnomalyDetectorWithNonExistingFeatureField() throws IOEx
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.FEATURE_WITH_EMPTY_DATA_MSG));
+        assertTrue(response.getIssue().getMessage().contains(CommonMessages.FEATURE_WITH_EMPTY_DATA_MSG));
         assertTrue(response.getIssue().getSubIssues().containsKey(maxFeature.getName()));
-        assertTrue(CommonErrorMessages.FEATURE_WITH_EMPTY_DATA_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName())));
+        assertTrue(CommonMessages.FEATURE_WITH_EMPTY_DATA_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName())));
     }

     @Test
@@ -132,8 +133,8 @@ public void testValidateAnomalyDetectorWithDuplicateFeatureAggregationNames() th
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertTrue(response.getIssue().getMessage().contains("Detector has duplicate feature aggregation query names:"));
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertTrue(response.getIssue().getMessage().contains("Config has duplicate feature aggregation query names:"));
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
     }

@@ -154,9 +155,9 @@ public void testValidateAnomalyDetectorWithDuplicateFeatureNamesAndDuplicateAggr
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertTrue(response.getIssue().getMessage().contains("Detector has duplicate feature aggregation query names:"));
-        assertTrue(response.getIssue().getMessage().contains("Detector has duplicate feature names:"));
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertTrue(response.getIssue().getMessage().contains("Config has duplicate feature aggregation query names:"));
+        assertTrue(response.getIssue().getMessage().contains("There are duplicate feature names:"));
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
     }

@@ -177,8 +178,11 @@ public void testValidateAnomalyDetectorWithDuplicateFeatureNames() throws IOExce
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertTrue(response.getIssue().getMessage().contains("Detector has duplicate feature names:"));
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertTrue(
+            "actual: " + response.getIssue().getMessage(),
+            response.getIssue().getMessage().contains("There are duplicate feature names:")
+        );
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
     }

@@ -197,13 +201,11 @@ public void testValidateAnomalyDetectorWithInvalidFeatureField() throws IOExcept
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG));
+        assertTrue(response.getIssue().getMessage().contains(CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG));
         assertTrue(response.getIssue().getSubIssues().containsKey(maxFeature.getName()));
-        assertTrue(
-            CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName()))
-        );
+        assertTrue(CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName())));
     }

     @Test
@@ -226,9 +228,9 @@ public void testValidateAnomalyDetectorWithUnknownFeatureField() throws IOExcept
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG));
+        assertTrue(response.getIssue().getMessage().contains(CommonMessages.UNKNOWN_SEARCH_QUERY_EXCEPTION_MSG));
         assertTrue(response.getIssue().getSubIssues().containsKey(nameField));
     }

@@ -250,18 +252,16 @@ public void testValidateAnomalyDetectorWithMultipleInvalidFeatureField() throws
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
         assertNotNull(response.getIssue());
         assertEquals(response.getIssue().getSubIssues().keySet().size(), 2);
-        assertEquals(DetectorValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
+        assertEquals(ValidationIssueType.FEATURE_ATTRIBUTES, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG));
+        assertTrue(response.getIssue().getMessage().contains(CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG));
         assertTrue(response.getIssue().getSubIssues().containsKey(maxFeature.getName()));
-        assertTrue(
-            CommonErrorMessages.FEATURE_WITH_INVALID_QUERY_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName()))
-        );
+        assertTrue(CommonMessages.FEATURE_WITH_INVALID_QUERY_MSG.contains(response.getIssue().getSubIssues().get(maxFeature.getName())));
     }

     @Test
     public void testValidateAnomalyDetectorWithCustomResultIndex() throws IOException {
-        String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
+        String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
         createCustomADResultIndex(resultIndex);
         AnomalyDetector anomalyDetector = TestHelpers
             .randomDetector(
@@ -298,8 +298,8 @@ public void testValidateAnomalyDetectorWithCustomResultIndexPresentButNotCreated

     @Test
     public void testValidateAnomalyDetectorWithCustomResultIndexWithInvalidMapping() throws IOException {
-        String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
-        URL url = AnomalyDetectionIndices.class.getClassLoader().getResource("mappings/checkpoint.json");
+        String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
+        URL url = ADIndexManagement.class.getClassLoader().getResource("mappings/anomaly-checkpoint.json");
         createIndex(resultIndex, Resources.toString(url, Charsets.UTF_8));
         AnomalyDetector anomalyDetector = TestHelpers
             .randomDetector(
@@ -320,13 +320,13 @@ public void testValidateAnomalyDetectorWithCustomResultIndexWithInvalidMapping()
             new TimeValue(5_000L)
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
-        assertEquals(DetectorValidationIssueType.RESULT_INDEX, response.getIssue().getType());
+        assertEquals(ValidationIssueType.RESULT_INDEX, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertTrue(response.getIssue().getMessage().contains(CommonErrorMessages.INVALID_RESULT_INDEX_MAPPING));
+        assertTrue(response.getIssue().getMessage().contains(CommonMessages.INVALID_RESULT_INDEX_MAPPING));
     }

     private void testValidateAnomalyDetectorWithCustomResultIndex(boolean resultIndexCreated) throws IOException {
-        String resultIndex = CommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
+        String resultIndex = ADCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test";
         if (resultIndexCreated) {
             createCustomADResultIndex(resultIndex);
         }
@@ -365,13 +365,14 @@ public void testValidateAnomalyDetectorWithInvalidDetectorName() throws IOExcept
             TestHelpers.randomQuery(),
             TestHelpers.randomIntervalTimeConfiguration(),
             TestHelpers.randomIntervalTimeConfiguration(),
-            AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE,
+            TimeSeriesSettings.DEFAULT_SHINGLE_SIZE,
             null,
             1,
             Instant.now(),
             null,
             TestHelpers.randomUser(),
-            null
+            null,
+            TestHelpers.randomImputationOption()
         );
         ingestTestDataValidate(anomalyDetector.getIndices().get(0), Instant.now().minus(1, ChronoUnit.DAYS), 1, "error");
         ValidateAnomalyDetectorRequest request = new ValidateAnomalyDetectorRequest(
@@ -383,9 +384,9 @@ public void testValidateAnomalyDetectorWithInvalidDetectorName() throws IOExcept
             new TimeValue(5_000L)
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
-        assertEquals(DetectorValidationIssueType.NAME, response.getIssue().getType());
+        assertEquals(ValidationIssueType.NAME, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
-        assertEquals(CommonErrorMessages.INVALID_DETECTOR_NAME, response.getIssue().getMessage());
+        assertEquals(CommonMessages.INVALID_NAME, response.getIssue().getMessage());
     }

     @Test
@@ -401,13 +402,14 @@ public void testValidateAnomalyDetectorWithDetectorNameTooLong() throws IOExcept
             TestHelpers.randomQuery(),
             TestHelpers.randomIntervalTimeConfiguration(),
             TestHelpers.randomIntervalTimeConfiguration(),
-            AnomalyDetectorSettings.DEFAULT_SHINGLE_SIZE,
+            TimeSeriesSettings.DEFAULT_SHINGLE_SIZE,
             null,
             1,
             Instant.now(),
             null,
             TestHelpers.randomUser(),
-            null
+            null,
+            TestHelpers.randomImputationOption()
         );
         ingestTestDataValidate(anomalyDetector.getIndices().get(0), Instant.now().minus(1, ChronoUnit.DAYS), 1, "error");
         ValidateAnomalyDetectorRequest request = new ValidateAnomalyDetectorRequest(
@@ -419,7 +421,7 @@ public void testValidateAnomalyDetectorWithDetectorNameTooLong() throws IOExcept
             new TimeValue(5_000L)
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
-        assertEquals(DetectorValidationIssueType.NAME, response.getIssue().getType());
+        assertEquals(ValidationIssueType.NAME, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
         assertTrue(response.getIssue().getMessage().contains("Name should be shortened. The maximum limit is"));
     }
@@ -437,10 +439,10 @@ public void testValidateAnomalyDetectorWithNonExistentTimefield() throws IOExcep
             new TimeValue(5_000L)
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
-        assertEquals(DetectorValidationIssueType.TIMEFIELD_FIELD, response.getIssue().getType());
+        assertEquals(ValidationIssueType.TIMEFIELD_FIELD, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
         assertEquals(
-            String.format(Locale.ROOT, CommonErrorMessages.NON_EXISTENT_TIMESTAMP, anomalyDetector.getTimeField()),
+            String.format(Locale.ROOT, CommonMessages.NON_EXISTENT_TIMESTAMP, anomalyDetector.getTimeField()),
             response.getIssue().getMessage()
         );
     }
@@ -458,10 +460,10 @@ public void testValidateAnomalyDetectorWithNonDateTimeField() throws IOException
             new TimeValue(5_000L)
         );
         ValidateAnomalyDetectorResponse response = client().execute(ValidateAnomalyDetectorAction.INSTANCE, request).actionGet(5_000);
-        assertEquals(DetectorValidationIssueType.TIMEFIELD_FIELD, response.getIssue().getType());
+        assertEquals(ValidationIssueType.TIMEFIELD_FIELD, response.getIssue().getType());
         assertEquals(ValidationAspect.DETECTOR, response.getIssue().getAspect());
         assertEquals(
-            String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIMESTAMP, anomalyDetector.getTimeField()),
+            String.format(Locale.ROOT, CommonMessages.INVALID_TIMESTAMP, anomalyDetector.getTimeField()),
             response.getIssue().getMessage()
         );
     }
diff --git a/src/test/java/org/opensearch/ad/transport/handler/ADSearchHandlerTests.java b/src/test/java/org/opensearch/ad/transport/handler/ADSearchHandlerTests.java
index 7fd9b98dd..441060c4f 100644
--- a/src/test/java/org/opensearch/ad/transport/handler/ADSearchHandlerTests.java
+++ b/src/test/java/org/opensearch/ad/transport/handler/ADSearchHandlerTests.java
@@ -17,8 +17,8 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.TestHelpers.matchAllRequest;
-import static org.opensearch.ad.settings.AnomalyDetectorSettings.FILTER_BY_BACKEND_ROLES;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.AD_FILTER_BY_BACKEND_ROLES;
+import static org.opensearch.timeseries.TestHelpers.matchAllRequest;

 import org.junit.Before;
 import org.opensearch.action.search.SearchRequest;
@@ -50,8 +50,8 @@ public class ADSearchHandlerTests extends ADUnitTestCase {
     @Override
     public void setUp() throws Exception {
         super.setUp();
-        settings = Settings.builder().put(FILTER_BY_BACKEND_ROLES.getKey(), false).build();
-        clusterSettings = clusterSetting(settings, FILTER_BY_BACKEND_ROLES);
+        settings = Settings.builder().put(AD_FILTER_BY_BACKEND_ROLES.getKey(), false).build();
+        clusterSettings = clusterSetting(settings, AD_FILTER_BY_BACKEND_ROLES);
         clusterService = new ClusterService(settings, clusterSettings, null);
         client = mock(Client.class);
         searchHandler = new ADSearchHandler(settings, clusterService, client);
@@ -74,7 +74,7 @@ public void testSearchException() {
     }

     public void testFilterEnabledWithWrongSearch() {
-        settings = Settings.builder().put(FILTER_BY_BACKEND_ROLES.getKey(), true).build();
+        settings = Settings.builder().put(AD_FILTER_BY_BACKEND_ROLES.getKey(), true).build();
         clusterService = new ClusterService(settings, clusterSettings, null);
         searchHandler = new ADSearchHandler(settings, clusterService, client);

@@ -83,7 +83,7 @@ public void testFilterEnabledWithWrongSearch() {
     }

     public void testFilterEnabled() {
-        settings = Settings.builder().put(FILTER_BY_BACKEND_ROLES.getKey(), true).build();
+        settings = Settings.builder().put(AD_FILTER_BY_BACKEND_ROLES.getKey(), true).build();
         clusterService = new ClusterService(settings, clusterSettings, null);
         searchHandler = new ADSearchHandler(settings, clusterService, client);
diff --git a/src/test/java/org/opensearch/ad/transport/handler/AbstractIndexHandlerTest.java b/src/test/java/org/opensearch/ad/transport/handler/AbstractIndexHandlerTest.java
index 3851a7fd4..12f966ffe 100644
--- a/src/test/java/org/opensearch/ad/transport/handler/AbstractIndexHandlerTest.java
+++ b/src/test/java/org/opensearch/ad/transport/handler/AbstractIndexHandlerTest.java
@@ -14,7 +14,7 @@
 import static org.mockito.ArgumentMatchers.any;
 import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.TestHelpers.createIndexBlockedState;
+import static org.opensearch.timeseries.TestHelpers.createIndexBlockedState;

 import java.io.IOException;
 import java.util.Arrays;
@@ -26,14 +26,10 @@
 import org.mockito.MockitoAnnotations;
 import org.opensearch.ResourceAlreadyExistsException;
 import org.opensearch.action.admin.indices.create.CreateIndexResponse;
-import org.opensearch.ad.AbstractADTest;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.constant.CommonName;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.constant.ADCommonName;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.transport.AnomalyResultTests;
-import org.opensearch.ad.util.ClientUtil;
 import org.opensearch.ad.util.IndexUtils;
-import org.opensearch.ad.util.Throttler;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.ClusterState;
 import org.opensearch.cluster.metadata.IndexMetadata;
@@ -43,8 +39,11 @@
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.AbstractTimeSeriesTest;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.util.ClientUtil;

-public abstract class AbstractIndexHandlerTest extends AbstractADTest {
+public abstract class AbstractIndexHandlerTest extends AbstractTimeSeriesTest {
     enum IndexCreation {
         RUNTIME_EXCEPTION,
         RESOURCE_EXISTS_EXCEPTION,
@@ -62,10 +61,7 @@ enum IndexCreation {
     protected Client client;

     @Mock
-    protected AnomalyDetectionIndices anomalyDetectionIndices;
-
-    @Mock
-    protected Throttler throttler;
+    protected ADIndexManagement anomalyDetectionIndices;

     @Mock
     protected ClusterService clusterService;
@@ -95,7 +91,7 @@ public void setUp() throws Exception {
         MockitoAnnotations.initMocks(this);
         setWriteBlockAdResultIndex(false);
         context = TestHelpers.createThreadPool();
-        clientUtil = new ClientUtil(settings, client, throttler, context);
+        clientUtil = new ClientUtil(client);
         indexUtil = new IndexUtils(client, clientUtil, clusterService, indexNameResolver);
     }

@@ -104,7 +100,7 @@ protected void setWriteBlockAdResultIndex(boolean blocked) {
         Settings settings = blocked
             ? Settings.builder().put(IndexMetadata.INDEX_BLOCKS_WRITE_SETTING.getKey(), true).build()
             : Settings.EMPTY;
-        ClusterState blockedClusterState = createIndexBlockedState(indexName, settings, CommonName.ANOMALY_RESULT_INDEX_ALIAS);
+        ClusterState blockedClusterState = createIndexBlockedState(indexName, settings, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS);
         when(clusterService.state()).thenReturn(blockedClusterState);
         when(indexNameResolver.concreteIndexNames(any(), any(), any(String.class))).thenReturn(new String[] { indexName });
     }
@@ -124,21 +120,21 @@ protected void setUpSavingAnomalyResultIndex(boolean anomalyResultIndexExists, I
                     listener.onFailure(new RuntimeException());
                     break;
                 case RESOURCE_EXISTS_EXCEPTION:
-                    listener.onFailure(new ResourceAlreadyExistsException(CommonName.ANOMALY_RESULT_INDEX_ALIAS));
+                    listener.onFailure(new ResourceAlreadyExistsException(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS));
                     break;
                 case ACKED:
-                    listener.onResponse(new CreateIndexResponse(true, true, CommonName.ANOMALY_RESULT_INDEX_ALIAS));
+                    listener.onResponse(new CreateIndexResponse(true, true, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS));
                     break;
                 case NOT_ACKED:
-                    listener.onResponse(new CreateIndexResponse(false, false, CommonName.ANOMALY_RESULT_INDEX_ALIAS));
+                    listener.onResponse(new CreateIndexResponse(false, false, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS));
                     break;
                 default:
                     assertTrue("should not reach here", false);
                     break;
             }
             return null;
-        }).when(anomalyDetectionIndices).initDefaultAnomalyResultIndexDirectly(any());
-        when(anomalyDetectionIndices.doesDefaultAnomalyResultIndexExist()).thenReturn(anomalyResultIndexExists);
+        }).when(anomalyDetectionIndices).initDefaultResultIndexDirectly(any());
+        when(anomalyDetectionIndices.doesDefaultResultIndexExist()).thenReturn(anomalyResultIndexExists);
     }

     protected void setUpSavingAnomalyResultIndex(boolean anomalyResultIndexExists) throws IOException {
diff --git a/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandlerTests.java b/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandlerTests.java
index 076fa7068..68699b74e 100644
--- a/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandlerTests.java
+++ b/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultBulkIndexHandlerTests.java
@@ -20,10 +20,11 @@
 import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 import static org.mockito.Mockito.when;
-import static org.opensearch.ad.constant.CommonName.ANOMALY_RESULT_INDEX_ALIAS;
+import static org.opensearch.ad.constant.ADCommonName.ANOMALY_RESULT_INDEX_ALIAS;

 import java.io.IOException;
 import java.time.Clock;
+import java.util.Optional;

 import org.opensearch.action.DocWriteRequest;
 import org.opensearch.action.admin.indices.create.CreateIndexResponse;
@@ -33,12 +34,9 @@
 import org.opensearch.action.bulk.BulkResponse;
 import org.opensearch.action.index.IndexResponse;
 import org.opensearch.ad.ADUnitTestCase;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.indices.AnomalyDetectionIndices;
+import org.opensearch.ad.indices.ADIndexManagement;
 import org.opensearch.ad.model.AnomalyResult;
-import org.opensearch.ad.util.ClientUtil;
 import org.opensearch.ad.util.IndexUtils;
-import org.opensearch.ad.util.Throttler;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.Settings;
@@ -47,6 +45,8 @@
 import org.opensearch.core.index.shard.ShardId;
 import org.opensearch.index.engine.VersionConflictEngineException;
 import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.util.ClientUtil;

 import com.google.common.collect.ImmutableList;

@@ -56,18 +56,17 @@ public class AnomalyResultBulkIndexHandlerTests extends ADUnitTestCase {
     private Client client;
     private IndexUtils indexUtils;
     private ActionListener<BulkResponse> listener;
-    private AnomalyDetectionIndices anomalyDetectionIndices;
+    private ADIndexManagement anomalyDetectionIndices;

     @Override
     public void setUp() throws Exception {
         super.setUp();
-        anomalyDetectionIndices = mock(AnomalyDetectionIndices.class);
+        anomalyDetectionIndices = mock(ADIndexManagement.class);
         client = mock(Client.class);
         Settings settings = Settings.EMPTY;
         Clock clock = mock(Clock.class);
-        Throttler throttler = new Throttler(clock);
         ThreadPool threadpool = mock(ThreadPool.class);
-        ClientUtil clientUtil = new ClientUtil(Settings.EMPTY, client, throttler, threadpool);
+        ClientUtil clientUtil = new ClientUtil(client);
         indexUtils = mock(IndexUtils.class);
         ClusterService clusterService = mock(ClusterService.class);
         ThreadPool threadPool = mock(ThreadPool.class);
@@ -92,13 +91,13 @@ public void onFailure(Exception e) {}
     public void testNullAnomalyResults() {
         bulkIndexHandler.bulkIndexAnomalyResult(null, null, listener);
         verify(listener, times(1)).onResponse(null);
-        verify(anomalyDetectionIndices, never()).doesAnomalyDetectorIndexExist();
+        verify(anomalyDetectionIndices, never()).doesConfigIndexExist();
     }

     public void testAnomalyResultBulkIndexHandler_IndexNotExist() {
         when(anomalyDetectionIndices.doesIndexExist("testIndex")).thenReturn(false);
         AnomalyResult anomalyResult = mock(AnomalyResult.class);
-        when(anomalyResult.getDetectorId()).thenReturn("testId");
+        when(anomalyResult.getConfigId()).thenReturn("testId");

         bulkIndexHandler.bulkIndexAnomalyResult("testIndex", ImmutableList.of(anomalyResult), listener);
         verify(listener, times(1)).onFailure(exceptionCaptor.capture());
@@ -109,7 +108,7 @@ public void testAnomalyResultBulkIndexHandler_InValidResultIndexMapping() {
         when(anomalyDetectionIndices.doesIndexExist("testIndex")).thenReturn(true);
         when(anomalyDetectionIndices.isValidResultIndexMapping("testIndex")).thenReturn(false);
         AnomalyResult anomalyResult = mock(AnomalyResult.class);
-        when(anomalyResult.getDetectorId()).thenReturn("testId");
+        when(anomalyResult.getConfigId()).thenReturn("testId");

         bulkIndexHandler.bulkIndexAnomalyResult("testIndex", ImmutableList.of(anomalyResult), listener);
         verify(listener, times(1)).onFailure(exceptionCaptor.capture());
@@ -120,7 +119,7 @@ public void testAnomalyResultBulkIndexHandler_FailBulkIndexAnomaly() throws IOEx
         when(anomalyDetectionIndices.doesIndexExist("testIndex")).thenReturn(true);
         when(anomalyDetectionIndices.isValidResultIndexMapping("testIndex")).thenReturn(true);
         AnomalyResult anomalyResult = mock(AnomalyResult.class);
-        when(anomalyResult.getDetectorId()).thenReturn("testId");
+        when(anomalyResult.getConfigId()).thenReturn("testId");
         when(anomalyResult.toXContent(any(), any())).thenThrow(new RuntimeException());

         bulkIndexHandler.bulkIndexAnomalyResult("testIndex", ImmutableList.of(anomalyResult), listener);
@@ -133,7 +132,7 @@ public void testCreateADResultIndexNotAcknowledged() throws IOException {
             ActionListener<CreateIndexResponse> listener = invocation.getArgument(0);
             listener.onResponse(new CreateIndexResponse(false, false, ANOMALY_RESULT_INDEX_ALIAS));
             return null;
-        }).when(anomalyDetectionIndices).initDefaultAnomalyResultIndexDirectly(any());
+        }).when(anomalyDetectionIndices).initDefaultResultIndexDirectly(any());
         bulkIndexHandler.bulkIndexAnomalyResult(null, ImmutableList.of(mock(AnomalyResult.class)), listener);
         verify(listener, times(1)).onFailure(exceptionCaptor.capture());
         assertEquals("Creating anomaly result index with mappings call not acknowledged", exceptionCaptor.getValue().getMessage());
@@ -142,7 +141,7 @@ public void testCreateADResultIndexNotAcknowledged() throws IOException {
     public void testWrongAnomalyResult() {
         BulkRequestBuilder bulkRequestBuilder = new BulkRequestBuilder(client, BulkAction.INSTANCE);
         doReturn(bulkRequestBuilder).when(client).prepareBulk();
-        doReturn(true).when(anomalyDetectionIndices).doesDefaultAnomalyResultIndexExist();
+        doReturn(true).when(anomalyDetectionIndices).doesDefaultResultIndexExist();
         doAnswer(invocation -> {
             ActionListener<BulkResponse> listener = invocation.getArgument(1);
             BulkItemResponse[] bulkItemResponses = new BulkItemResponse[2];
@@ -176,7 +175,7 @@ public void testWrongAnomalyResult() {
     public void testBulkSaveException() {
         BulkRequestBuilder bulkRequestBuilder = mock(BulkRequestBuilder.class);
         doReturn(bulkRequestBuilder).when(client).prepareBulk();
-        doReturn(true).when(anomalyDetectionIndices).doesDefaultAnomalyResultIndexExist();
+        doReturn(true).when(anomalyDetectionIndices).doesDefaultResultIndexExist();

         String testError = randomAlphaOfLength(5);
         doAnswer(invocation -> {
@@ -203,7 +202,7 @@ private AnomalyResult wrongAnomalyResult() {
             null,
             null,
             randomAlphaOfLength(5),
-            null,
+            Optional.empty(),
             null,
             null,
             null,
diff --git a/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultHandlerTests.java b/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultHandlerTests.java
index 5eb078e24..b17008e1d 100644
--- a/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultHandlerTests.java
+++ b/src/test/java/org/opensearch/ad/transport/handler/AnomalyResultHandlerTests.java
@@ -33,15 +33,15 @@
 import org.mockito.Mock;
 import org.opensearch.action.index.IndexRequest;
 import org.opensearch.action.index.IndexResponse;
-import org.opensearch.ad.NodeStateManager;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
-import org.opensearch.ad.constant.CommonName;
+import org.opensearch.ad.constant.ADCommonName;
 import org.opensearch.ad.model.AnomalyResult;
 import org.opensearch.common.settings.Settings;
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.core.action.ActionListener;
 import org.opensearch.core.concurrency.OpenSearchRejectedExecutionException;
+import org.opensearch.timeseries.NodeStateManager;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;

 public class AnomalyResultHandlerTests extends AbstractIndexHandlerTest {
     @Mock
@@ -85,7 +85,7 @@ public void testSavingAdResult() throws IOException {
             client,
             settings,
             threadPool,
-            CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
             anomalyDetectionIndices,
             clientUtil,
             indexUtil,
@@ -121,7 +121,7 @@ public void testIndexWriteBlock() {
             client,
             settings,
             threadPool,
-            CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
             anomalyDetectionIndices,
             clientUtil,
             indexUtil,
@@ -139,7 +139,7 @@ public void testAdResultIndexExist() throws IOException {
             client,
             settings,
             threadPool,
-            CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
             anomalyDetectionIndices,
             clientUtil,
             indexUtil,
@@ -151,7 +151,7 @@ public void testAdResultIndexExist() throws IOException {

     @Test
     public void testAdResultIndexOtherException() throws IOException {
-        expectedEx.expect(AnomalyDetectionException.class);
+        expectedEx.expect(TimeSeriesException.class);
         expectedEx.expectMessage("Error in saving .opendistro-anomaly-results for detector " + detectorId);
         setUpSavingAnomalyResultIndex(false, IndexCreation.RUNTIME_EXCEPTION);

@@ -159,7 +159,7 @@ public void testAdResultIndexOtherException() throws IOException {
             client,
             settings,
             threadPool,
-            CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
             anomalyDetectionIndices,
             clientUtil,
             indexUtil,
@@ -217,7 +217,7 @@ private void savingFailureTemplate(boolean throwOpenSearchRejectedExecutionExcep
             client,
             backoffSettings,
             threadPool,
-            CommonName.ANOMALY_RESULT_INDEX_ALIAS,
+            ADCommonName.ANOMALY_RESULT_INDEX_ALIAS,
             anomalyDetectionIndices,
             clientUtil,
             indexUtil,
diff --git a/src/test/java/org/opensearch/ad/transport/handler/MultiEntityResultHandlerTests.java b/src/test/java/org/opensearch/ad/transport/handler/MultiEntityResultHandlerTests.java
index bab46188c..f6483c8b7 100644
--- a/src/test/java/org/opensearch/ad/transport/handler/MultiEntityResultHandlerTests.java
+++ b/src/test/java/org/opensearch/ad/transport/handler/MultiEntityResultHandlerTests.java
@@ -23,14 +23,14 @@

 import org.junit.Test;
 import org.mockito.ArgumentMatchers;
-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.ad.ratelimit.RequestPriority;
 import org.opensearch.ad.ratelimit.ResultWriteRequest;
 import org.opensearch.ad.transport.ADResultBulkAction;
 import org.opensearch.ad.transport.ADResultBulkRequest;
 import org.opensearch.ad.transport.ADResultBulkResponse;
 import org.opensearch.core.action.ActionListener;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;

 public class MultiEntityResultHandlerTests extends AbstractIndexHandlerTest {
     private MultiEntityResultHandler handler;
@@ -88,7 +88,7 @@ public void testIndexWriteBlock() throws InterruptedException {
             assertTrue("Should not reach here ", false);
             verified.countDown();
         }, exception -> {
-            assertTrue(exception instanceof AnomalyDetectionException);
+            assertTrue(exception instanceof TimeSeriesException);
             assertTrue(
                 "actual: " + exception.getMessage(),
                 exception.getMessage().contains(MultiEntityResultHandler.CANNOT_SAVE_RESULT_ERR_MSG)
@@ -154,7 +154,7 @@ public void testNothingToSave() throws IOException, InterruptedException {
             assertTrue("Should not reach here ", false);
             verified.countDown();
         }, exception -> {
-            assertTrue(exception instanceof AnomalyDetectionException);
+            assertTrue(exception instanceof TimeSeriesException);
             verified.countDown();
         }));
         assertTrue(verified.await(100, TimeUnit.SECONDS));
@@ -169,7 +169,7 @@ public void testCreateUnAcked() throws IOException, InterruptedException {
             assertTrue("Should not reach here ", false);
             verified.countDown();
         }, exception -> {
-            assertTrue(exception instanceof AnomalyDetectionException);
+            assertTrue(exception instanceof TimeSeriesException);
             verified.countDown();
         }));
         assertTrue(verified.await(100, TimeUnit.SECONDS));
diff --git a/src/test/java/org/opensearch/ad/util/ExceptionUtilsTests.java b/src/test/java/org/opensearch/ad/util/ExceptionUtilsTests.java
index 794cb9e5f..3a9ff1047 100644
--- a/src/test/java/org/opensearch/ad/util/ExceptionUtilsTests.java
+++ b/src/test/java/org/opensearch/ad/util/ExceptionUtilsTests.java
@@ -14,10 +14,11 @@
 import org.opensearch.OpenSearchException;
 import org.opensearch.action.index.IndexResponse;
 import org.opensearch.action.support.replication.ReplicationResponse;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.core.index.shard.ShardId;
 import org.opensearch.core.rest.RestStatus;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.util.ExceptionUtil;

 public class ExceptionUtilsTests extends OpenSearchTestCase {

@@ -48,13 +49,13 @@ public void testGetShardsFailureWithoutError() {
     }

     public void testCountInStats() {
-        assertTrue(ExceptionUtil.countInStats(new AnomalyDetectionException("test")));
-        assertFalse(ExceptionUtil.countInStats(new AnomalyDetectionException("test").countedInStats(false)));
+        assertTrue(ExceptionUtil.countInStats(new TimeSeriesException("test")));
+        assertFalse(ExceptionUtil.countInStats(new TimeSeriesException("test").countedInStats(false)));
         assertTrue(ExceptionUtil.countInStats(new RuntimeException("test")));
     }

     public void testGetErrorMessage() {
-        assertEquals("test", ExceptionUtil.getErrorMessage(new AnomalyDetectionException("test")));
+        assertEquals("test", ExceptionUtil.getErrorMessage(new TimeSeriesException("test")));
         assertEquals("test", ExceptionUtil.getErrorMessage(new IllegalArgumentException("test")));
         assertEquals("OpenSearchException[test]", ExceptionUtil.getErrorMessage(new OpenSearchException("test")));
         assertTrue(
diff --git a/src/test/java/org/opensearch/ad/util/IndexUtilsTests.java b/src/test/java/org/opensearch/ad/util/IndexUtilsTests.java
index b57459c7a..7234f6feb 100644
--- a/src/test/java/org/opensearch/ad/util/IndexUtilsTests.java
+++ b/src/test/java/org/opensearch/ad/util/IndexUtilsTests.java
@@ -13,17 +13,13 @@

 import static org.mockito.Mockito.mock;

-import java.time.Clock;
-
 import org.junit.Before;
 import org.junit.Test;
 import org.opensearch.action.support.master.AcknowledgedResponse;
-import org.opensearch.ad.TestHelpers;
 import org.opensearch.client.Client;
 import org.opensearch.cluster.metadata.IndexNameExpressionResolver;
-import org.opensearch.common.settings.Settings;
 import org.opensearch.test.OpenSearchIntegTestCase;
-import org.opensearch.threadpool.ThreadPool;
+import org.opensearch.timeseries.util.ClientUtil;

 public class IndexUtilsTests extends OpenSearchIntegTestCase {

@@ -34,10 +30,7 @@ public class IndexUtilsTests extends OpenSearchIntegTestCase {
     @Before
     public void setup() {
         Client client = client();
-        Clock clock = mock(Clock.class);
-        Throttler throttler = new Throttler(clock);
-        ThreadPool context = TestHelpers.createThreadPool();
-        clientUtil = new ClientUtil(Settings.EMPTY, client, throttler, context);
+        clientUtil = new ClientUtil(client);
         indexNameResolver = mock(IndexNameExpressionResolver.class);
     }

@@ -70,25 +63,4 @@ public void testGetIndexHealth_Alias() {
         String status = indexUtils.getIndexHealthStatus(aliasName);
         assertTrue(status.equals("green") || status.equals("yellow"));
     }
-
-    @Test
-    public void testGetNumberOfDocumentsInIndex_NonExistentIndex() {
-        IndexUtils indexUtils = new IndexUtils(client(), clientUtil, clusterService(), indexNameResolver);
-        assertEquals((Long) 0L, indexUtils.getNumberOfDocumentsInIndex("index"));
-    }
-
-    @Test
-    public void testGetNumberOfDocumentsInIndex_RegularIndex() {
-        String indexName = "test-2";
-        createIndex(indexName);
-        flush();
-
-        long count = 2100;
-        for (int i = 0; i < count; i++) {
-            index(indexName, "_doc", String.valueOf(i), "{}");
-        }
-        flushAndRefresh(indexName);
-        IndexUtils indexUtils = new IndexUtils(client(), clientUtil, clusterService(), indexNameResolver);
-        assertEquals((Long) count, indexUtils.getNumberOfDocumentsInIndex(indexName));
-    }
 }
diff --git a/src/test/java/org/opensearch/ad/util/ParseUtilsTests.java b/src/test/java/org/opensearch/ad/util/ParseUtilsTests.java
index fea717a53..c2dd673b4 100644
--- a/src/test/java/org/opensearch/ad/util/ParseUtilsTests.java
+++ b/src/test/java/org/opensearch/ad/util/ParseUtilsTests.java
@@ -11,18 +11,15 @@

 package org.opensearch.ad.util;

-import static org.opensearch.ad.util.ParseUtils.addUserBackendRolesFilter;
-import static org.opensearch.ad.util.ParseUtils.isAdmin;
+import static org.opensearch.timeseries.util.ParseUtils.addUserBackendRolesFilter;
+import static org.opensearch.timeseries.util.ParseUtils.isAdmin;

 import java.io.IOException;
 import java.time.Instant;
 import java.time.temporal.ChronoUnit;
 import java.util.List;

-import org.opensearch.ad.TestHelpers;
-import org.opensearch.ad.common.exception.AnomalyDetectionException;
 import org.opensearch.ad.model.AnomalyDetector;
-import org.opensearch.ad.model.Feature;
 import org.opensearch.common.xcontent.XContentFactory;
 import org.opensearch.commons.authuser.User;
 import org.opensearch.core.common.ParsingException;
@@ -32,6 +29,10 @@
 import org.opensearch.search.aggregations.AggregatorFactories;
 import org.opensearch.search.builder.SearchSourceBuilder;
 import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.TestHelpers;
+import org.opensearch.timeseries.common.exception.TimeSeriesException;
+import org.opensearch.timeseries.model.Feature;
+import org.opensearch.timeseries.util.ParseUtils;

 import com.google.common.collect.ImmutableList;

@@ -124,14 +125,6 @@ public void testGenerateInternalFeatureQuery() throws IOException {
         }
     }

-    public void testGenerateInternalFeatureQueryTemplate() throws IOException {
-        AnomalyDetector detector = TestHelpers.randomAnomalyDetector(null, Instant.now());
-        String builder = ParseUtils.generateInternalFeatureQueryTemplate(detector, TestHelpers.xContentRegistry());
-        for (Feature feature : detector.getFeatureAttributes()) {
-            assertTrue(builder.contains(feature.getId()));
-        }
-    }
-
     public void testAddUserRoleFilterWithNullUser() {
         SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
addUserBackendRolesFilter(null, searchSourceBuilder); @@ -242,8 +235,8 @@ public void testBatchFeatureQueryWithoutEnabledFeature() throws IOException { long startTime = now.minus(10, ChronoUnit.DAYS).toEpochMilli(); long endTime = now.plus(10, ChronoUnit.DAYS).toEpochMilli(); - AnomalyDetectionException exception = expectThrows( - AnomalyDetectionException.class, + TimeSeriesException exception = expectThrows( + TimeSeriesException.class, () -> ParseUtils.batchFeatureQuery(detector, null, startTime, endTime, TestHelpers.xContentRegistry()) ); assertEquals("No enabled feature configured", exception.getMessage()); @@ -257,8 +250,8 @@ public void testBatchFeatureQueryWithoutFeature() throws IOException { long startTime = now.minus(10, ChronoUnit.DAYS).toEpochMilli(); long endTime = now.plus(10, ChronoUnit.DAYS).toEpochMilli(); - AnomalyDetectionException exception = expectThrows( - AnomalyDetectionException.class, + TimeSeriesException exception = expectThrows( + TimeSeriesException.class, () -> ParseUtils.batchFeatureQuery(detector, null, startTime, endTime, TestHelpers.xContentRegistry()) ); assertEquals("No enabled feature configured", exception.getMessage()); diff --git a/src/test/java/org/opensearch/ad/util/RestHandlerUtilsTests.java b/src/test/java/org/opensearch/ad/util/RestHandlerUtilsTests.java index 708e3b862..ecd60e5d4 100644 --- a/src/test/java/org/opensearch/ad/util/RestHandlerUtilsTests.java +++ b/src/test/java/org/opensearch/ad/util/RestHandlerUtilsTests.java @@ -11,13 +11,12 @@ package org.opensearch.ad.util; -import static org.opensearch.ad.TestHelpers.builder; -import static org.opensearch.ad.TestHelpers.randomFeature; -import static org.opensearch.ad.util.RestHandlerUtils.OPENSEARCH_DASHBOARDS_USER_AGENT; +import static org.opensearch.timeseries.TestHelpers.builder; +import static org.opensearch.timeseries.TestHelpers.randomFeature; +import static org.opensearch.timeseries.util.RestHandlerUtils.OPENSEARCH_DASHBOARDS_USER_AGENT; import java.io.IOException; -import org.opensearch.ad.TestHelpers; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.core.common.bytes.BytesReference; import org.opensearch.core.xcontent.NamedXContentRegistry; @@ -30,6 +29,8 @@ import org.opensearch.test.OpenSearchTestCase; import org.opensearch.test.rest.FakeRestChannel; import org.opensearch.test.rest.FakeRestRequest; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.util.RestHandlerUtils; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -86,8 +87,8 @@ public void testisExceptionCausedByInvalidQueryNotSearchPhaseException() { public void testValidateAnomalyDetectorWithTooManyFeatures() throws IOException { AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableList.of(randomFeature(), randomFeature())); - String error = RestHandlerUtils.checkAnomalyDetectorFeaturesSyntax(detector, 1); - assertEquals("Can't create more than 1 anomaly features", error); + String error = RestHandlerUtils.checkFeaturesSyntax(detector, 1); + assertEquals("Can't create more than 1 features", error); } public void testValidateAnomalyDetectorWithDuplicateFeatureNames() throws IOException { @@ -96,8 +97,8 @@ public void testValidateAnomalyDetectorWithDuplicateFeatureNames() throws IOExce .randomAnomalyDetector( ImmutableList.of(randomFeature(featureName, randomAlphaOfLength(5)), randomFeature(featureName, randomAlphaOfLength(5))) ); - String error = RestHandlerUtils.checkAnomalyDetectorFeaturesSyntax(detector, 
2); - assertEquals("Detector has duplicate feature names: " + featureName, error); + String error = RestHandlerUtils.checkFeaturesSyntax(detector, 2); + assertEquals("There are duplicate feature names: " + featureName, error); } public void testValidateAnomalyDetectorWithDuplicateAggregationNames() throws IOException { @@ -107,7 +108,7 @@ public void testValidateAnomalyDetectorWithDuplicateAggregationNames() throws IO ImmutableList .of(randomFeature(randomAlphaOfLength(5), aggregationName), randomFeature(randomAlphaOfLength(5), aggregationName)) ); - String error = RestHandlerUtils.checkAnomalyDetectorFeaturesSyntax(detector, 2); - assertEquals("Detector has duplicate feature aggregation query names: " + aggregationName, error); + String error = RestHandlerUtils.checkFeaturesSyntax(detector, 2); + assertEquals("Config has duplicate feature aggregation query names: " + aggregationName, error); } } diff --git a/src/test/java/org/opensearch/ad/util/ThrottlerTests.java b/src/test/java/org/opensearch/ad/util/ThrottlerTests.java deleted file mode 100644 index 63d060b28..000000000 --- a/src/test/java/org/opensearch/ad/util/ThrottlerTests.java +++ /dev/null @@ -1,67 +0,0 @@ -/* - * SPDX-License-Identifier: Apache-2.0 - * - * The OpenSearch Contributors require contributions made to - * this file be licensed under the Apache-2.0 license or a - * compatible open source license. - * - * Modifications Copyright OpenSearch Contributors. See - * GitHub history for details. - */ - -package org.opensearch.ad.util; - -import static org.mockito.Mockito.mock; -import static org.mockito.Mockito.when; - -import java.time.Clock; - -import org.junit.Before; -import org.junit.Test; -import org.opensearch.action.search.SearchRequest; -import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.test.OpenSearchTestCase; - -public class ThrottlerTests extends OpenSearchTestCase { - private Throttler throttler; - - @Before - public void setup() { - Clock clock = mock(Clock.class); - this.throttler = new Throttler(clock); - } - - @Test - public void testGetFilteredQuery() { - AnomalyDetector detector = mock(AnomalyDetector.class); - when(detector.getDetectorId()).thenReturn("test detector Id"); - SearchRequest dummySearchRequest = new SearchRequest(); - throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest); - // case 1: key exists - assertTrue(throttler.getFilteredQuery(detector.getDetectorId()).isPresent()); - // case 2: key doesn't exist - assertFalse(throttler.getFilteredQuery("different test detector Id").isPresent()); - } - - @Test - public void testInsertFilteredQuery() { - AnomalyDetector detector = mock(AnomalyDetector.class); - when(detector.getDetectorId()).thenReturn("test detector Id"); - SearchRequest dummySearchRequest = new SearchRequest(); - // first time: key doesn't exist - assertTrue(throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest)); - // second time: key exists - assertFalse(throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest)); - } - - @Test - public void testClearFilteredQuery() { - AnomalyDetector detector = mock(AnomalyDetector.class); - when(detector.getDetectorId()).thenReturn("test detector Id"); - SearchRequest dummySearchRequest = new SearchRequest(); - assertTrue(throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest)); - throttler.clearFilteredQuery(detector.getDetectorId()); - assertTrue(throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest)); - } - -} diff --git 
a/src/test/java/org/opensearch/ad/util/ThrowingSupplierWrapperTests.java b/src/test/java/org/opensearch/ad/util/ThrowingSupplierWrapperTests.java index 3dbe7eb58..f7db4a278 100644 --- a/src/test/java/org/opensearch/ad/util/ThrowingSupplierWrapperTests.java +++ b/src/test/java/org/opensearch/ad/util/ThrowingSupplierWrapperTests.java @@ -14,6 +14,7 @@ import java.io.IOException; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.function.ThrowingSupplierWrapper; public class ThrowingSupplierWrapperTests extends OpenSearchTestCase { private static String foo() throws IOException { diff --git a/src/test/java/org/opensearch/forecast/indices/ForecastIndexManagementTests.java b/src/test/java/org/opensearch/forecast/indices/ForecastIndexManagementTests.java new file mode 100644 index 000000000..cdc19a4ac --- /dev/null +++ b/src/test/java/org/opensearch/forecast/indices/ForecastIndexManagementTests.java @@ -0,0 +1,338 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.indices; + +import static org.hamcrest.Matchers.equalTo; +import static org.hamcrest.Matchers.is; +import static org.hamcrest.Matchers.notNullValue; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.doAnswer; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.never; +import static org.mockito.Mockito.verify; + +import java.io.IOException; +import java.util.Collection; +import java.util.Collections; +import java.util.Locale; +import java.util.concurrent.CountDownLatch; +import java.util.concurrent.TimeUnit; + +import org.hamcrest.MatcherAssert; +import org.junit.Before; +import org.opensearch.action.admin.indices.alias.get.GetAliasesResponse; +import org.opensearch.action.admin.indices.get.GetIndexResponse; +import org.opensearch.common.settings.Settings; +import org.opensearch.common.unit.TimeValue; +import org.opensearch.core.action.ActionListener; +import org.opensearch.index.IndexNotFoundException; +import org.opensearch.plugins.Plugin; +import org.opensearch.test.OpenSearchIntegTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.indices.IndexManagementIntegTestCase; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; + +@OpenSearchIntegTestCase.ClusterScope(scope = OpenSearchIntegTestCase.Scope.TEST, numDataNodes = 0, numClientNodes = 0, supportsDedicatedMasters = false) +public class ForecastIndexManagementTests extends IndexManagementIntegTestCase { + private ForecastIndexManagement indices; + private Settings settings; + private DiscoveryNodeFilterer nodeFilter; + + @Override + protected boolean ignoreExternalCluster() { + return true; + } + + // help register setting using TimeSeriesAnalyticsPlugin.getSettings. + // Otherwise, ForecastIndexManagement's constructor would fail due to + // unregistered settings like FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD. 
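Note: ClusterSettings only accepts settings that some installed plugin has declared via Plugin#getSettings(), which is why this test must load TimeSeriesAnalyticsPlugin through nodePlugins() below. A minimal sketch of the failure mode the comment above describes (the setting key comes from this test; the default value, bounds, and properties here are illustrative, not the plugin's actual declaration):

    // assumes: import org.opensearch.common.settings.*; import java.util.Collections;
    Setting<Long> maxDocsPerShard = Setting
        .longSetting("plugins.forecast.forecast_result_history_max_docs_per_shard", 10_000L, 0L, Setting.Property.NodeScope, Setting.Property.Dynamic);
    // Registered: the update consumer attaches without error.
    ClusterSettings registered = new ClusterSettings(Settings.EMPTY, Collections.<Setting<?>>singleton(maxDocsPerShard));
    registered.addSettingsUpdateConsumer(maxDocsPerShard, v -> {});
    // Unregistered: the same call throws IllegalArgumentException
    // ("Setting is not registered for key [...]"), which is roughly what would
    // surface from ForecastIndexManagement's constructor without the plugin.
    ClusterSettings unregistered = new ClusterSettings(Settings.EMPTY, Collections.emptySet());
    // unregistered.addSettingsUpdateConsumer(maxDocsPerShard, v -> {}); // throws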
+ @Override + protected Collection<Class<? extends Plugin>> nodePlugins() { + return Collections.singletonList(TimeSeriesAnalyticsPlugin.class); + } + + @Before + public void setup() throws IOException { + settings = Settings + .builder() + .put("plugins.forecast.forecast_result_history_rollover_period", TimeValue.timeValueHours(12)) + .put("plugins.forecast.forecast_result_history_retention_period", TimeValue.timeValueHours(24)) + .put("plugins.forecast.forecast_result_history_max_docs", 10000L) + .put("plugins.forecast.request_timeout", TimeValue.timeValueSeconds(10)) + .build(); + + internalCluster().ensureAtLeastNumDataNodes(1); + ensureStableCluster(1); + + nodeFilter = new DiscoveryNodeFilterer(clusterService()); + + indices = new ForecastIndexManagement( + client(), + clusterService(), + client().threadPool(), + settings, + nodeFilter, + TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES + ); + } + + public void testForecastResultIndexNotExists() { + boolean exists = indices.doesDefaultResultIndexExist(); + assertFalse(exists); + } + + public void testForecastResultIndexExists() throws IOException { + indices.initDefaultResultIndexIfAbsent(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.RESULT.getIndexName()); + assertTrue(indices.doesDefaultResultIndexExist()); + } + + public void testForecastResultIndexExistsAndNotRecreate() throws IOException { + indices + .initDefaultResultIndexIfAbsent( + TestHelpers.createActionListener(response -> logger.info("Acknowledged: " + response.isAcknowledged()), failure -> { + throw new RuntimeException("should not recreate index"); + }) + ); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.RESULT.getIndexName()); + if (client().admin().indices().prepareExists(ForecastIndex.RESULT.getIndexName()).get().isExists()) { + indices.initDefaultResultIndexIfAbsent(TestHelpers.createActionListener(response -> { + throw new RuntimeException("should not recreate index " + ForecastIndex.RESULT.getIndexName()); + }, failure -> { throw new RuntimeException("should not recreate index " + ForecastIndex.RESULT.getIndexName(), failure); })); + } + } + + public void testCheckpointIndexNotExists() { + boolean exists = indices.doesCheckpointIndexExist(); + assertFalse(exists); + } + + public void testCheckpointIndexExists() throws IOException { + indices.initCheckpointIndex(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.CHECKPOINT.getIndexName()); + assertTrue(indices.doesCheckpointIndexExist()); + } + + public void testStateIndexNotExists() { + boolean exists = indices.doesStateIndexExist(); + assertFalse(exists); + } + + public void testStateIndexExists() throws IOException { + indices.initStateIndex(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.STATE.getIndexName()); + assertTrue(indices.doesStateIndexExist()); + } + + public void testConfigIndexNotExists() { + boolean exists = indices.doesConfigIndexExist(); + assertFalse(exists); + }
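Note: every does*IndexExist assertion in this class is answered from the node's cluster state, which is why each asynchronous init*Index call is paired with TestHelpers.waitForIndexCreationToComplete before asserting. As a rough sketch of what such a check presumably reduces to (an illustrative condensation, not the actual IndexManagement implementation):

    ClusterState state = clusterService.state();
    boolean configIndexExists = state.metadata().hasIndex(ForecastIndex.CONFIG.getIndexName())
        || state.metadata().hasAlias(ForecastIndex.CONFIG.getIndexName());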
+ + public void testConfigIndexExists() throws IOException { + indices.initConfigIndex(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.CONFIG.getIndexName()); + assertTrue(indices.doesConfigIndexExist()); + } + + public void testCustomResultIndexExists() throws IOException { + String indexName = "a"; + assertTrue(!(client().admin().indices().prepareExists(indexName).get().isExists())); + indices + .initCustomResultIndexDirectly( + indexName, + TestHelpers.createActionListener(response -> logger.info("Acknowledged: " + response.isAcknowledged()), failure -> { + throw new RuntimeException("should not recreate index"); + }) + ); + TestHelpers.waitForIndexCreationToComplete(client(), indexName); + assertTrue((client().admin().indices().prepareExists(indexName).get().isExists())); + } + + public void testJobIndexNotExists() { + boolean exists = indices.doesJobIndexExist(); + assertFalse(exists); + } + + public void testJobIndexExists() throws IOException { + indices.initJobIndex(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.JOB.getIndexName()); + assertTrue(indices.doesJobIndexExist()); + } + + public void testValidateCustomIndexForBackendJobNoIndex() { + validateCustomIndexForBackendJobNoIndex(indices); + } + + public void testValidateCustomIndexForBackendJobInvalidMapping() { + validateCustomIndexForBackendJobInvalidMapping(indices); + } + + public void testValidateCustomIndexForBackendJob() throws IOException, InterruptedException { + validateCustomIndexForBackendJob(indices, ForecastIndexManagement.getResultMappings()); + } + + public void testRollOver() throws IOException, InterruptedException { + indices.initDefaultResultIndexIfAbsent(TestHelpers.createActionListener(response -> { + boolean acknowledged = response.isAcknowledged(); + assertTrue(acknowledged); + }, failure -> { throw new RuntimeException("should not recreate index"); })); + TestHelpers.waitForIndexCreationToComplete(client(), ForecastIndex.RESULT.getIndexName()); + client().index(indices.createDummyIndexRequest(ForecastIndex.RESULT.getIndexName())).actionGet(); + + GetAliasesResponse getAliasesResponse = admin().indices().prepareGetAliases(ForecastIndex.RESULT.getIndexName()).get(); + String oldIndex = getAliasesResponse.getAliases().keySet().iterator().next(); + + settings = Settings + .builder() + .put("plugins.forecast.forecast_result_history_rollover_period", TimeValue.timeValueHours(12)) + .put("plugins.forecast.forecast_result_history_retention_period", TimeValue.timeValueHours(0)) + .put("plugins.forecast.forecast_result_history_max_docs", 0L) + .put("plugins.forecast.forecast_result_history_max_docs_per_shard", 0L) + .put("plugins.forecast.request_timeout", TimeValue.timeValueSeconds(10)) + .build(); + + nodeFilter = new DiscoveryNodeFilterer(clusterService()); + + indices = new ForecastIndexManagement( + client(), + clusterService(), + client().threadPool(), + settings, + nodeFilter, + TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES + ); + indices.rolloverAndDeleteHistoryIndex(); + + // rollover replaces the trailing "-1" with "-000002". + // Example: + // Input: opensearch-forecast-results-history-2023.06.15-1 + // Output: opensearch-forecast-results-history-2023.06.15-000002 + String newIndex = oldIndex.replaceFirst("-1$", "-000002"); + TestHelpers.waitForIndexCreationToComplete(client(), newIndex); + + getAliasesResponse = admin().indices().prepareGetAliases(ForecastIndex.RESULT.getIndexName()).get(); + String currentPointedIndex = getAliasesResponse.getAliases().keySet().iterator().next(); + assertEquals(newIndex, currentPointedIndex); + + client().index(indices.createDummyIndexRequest(ForecastIndex.RESULT.getIndexName())).actionGet(); + // now we have two indices + indices.rolloverAndDeleteHistoryIndex(); + + String thirdIndexName = getIncrementedIndex(newIndex); + TestHelpers.waitForIndexCreationToComplete(client(), thirdIndexName); + getAliasesResponse = admin().indices().prepareGetAliases(ForecastIndex.RESULT.getIndexName()).get(); + currentPointedIndex = getAliasesResponse.getAliases().keySet().iterator().next(); + assertEquals(thirdIndexName, currentPointedIndex); + + // the oldest index should already be deleted since the retention period is 0 hrs + int retry = 0; + while (retry < 10) { + try { + client().admin().indices().prepareGetIndex().addIndices(oldIndex).get(); + retry++; + // wait for index to be deleted + Thread.sleep(1000); + } catch (IndexNotFoundException e) { + MatcherAssert.assertThat(e.getMessage(), is(String.format(Locale.ROOT, "no such index [%s]", oldIndex))); + break; + } + } + + assertTrue(retry < 10); + + // 2nd oldest index should be fine as we keep at most one old index + GetIndexResponse response = client().admin().indices().prepareGetIndex().addIndices(newIndex).get(); + String[] indicesInResponse = response.indices(); + MatcherAssert.assertThat(indicesInResponse, notNullValue()); + MatcherAssert.assertThat(indicesInResponse.length, equalTo(1)); + MatcherAssert.assertThat(indicesInResponse[0], equalTo(newIndex)); + + response = client().admin().indices().prepareGetIndex().addIndices(thirdIndexName).get(); + indicesInResponse = response.indices(); + MatcherAssert.assertThat(indicesInResponse, notNullValue()); + MatcherAssert.assertThat(indicesInResponse.length, equalTo(1)); + MatcherAssert.assertThat(indicesInResponse[0], equalTo(thirdIndexName)); + } + + /** + * Increment the last digit of an index name. + * @param input.
Example: opensearch-forecast-results-history-2023.06.15-000002 + * @return Example: opensearch-forecast-results-history-2023.06.15-000003 + */ + private String getIncrementedIndex(String input) { + int lastDash = input.lastIndexOf('-'); + + String prefix = input.substring(0, lastDash + 1); + String numberPart = input.substring(lastDash + 1); + + // Increment the number part + int incrementedNumber = Integer.parseInt(numberPart) + 1; + + // Use String.format to keep the leading zeros + String newNumberPart = String.format(Locale.ROOT, "%06d", incrementedNumber); + + return prefix + newNumberPart; + } + + public void testInitCustomResultIndexAndExecuteIndexNotExist() throws InterruptedException { + String resultIndex = "abc"; + ExecutorFunction function = mock(ExecutorFunction.class); + ActionListener listener = mock(ActionListener.class); + + CountDownLatch latch = new CountDownLatch(1); + doAnswer(invocation -> { + latch.countDown(); + return null; + }).when(function).execute(); + + indices.initCustomResultIndexAndExecute(resultIndex, function, listener); + latch.await(20, TimeUnit.SECONDS); + verify(listener, never()).onFailure(any(Exception.class)); + } + + public void testInitCustomResultIndexAndExecuteIndex() throws InterruptedException, IOException { + String indexName = "abc"; + ExecutorFunction function = mock(ExecutorFunction.class); + ActionListener listener = mock(ActionListener.class); + + indices + .initCustomResultIndexDirectly( + indexName, + TestHelpers.createActionListener(response -> logger.info("Acknowledged: " + response.isAcknowledged()), failure -> { + throw new RuntimeException("should not recreate index"); + }) + ); + TestHelpers.waitForIndexCreationToComplete(client(), indexName); + CountDownLatch latch = new CountDownLatch(1); + doAnswer(invocation -> { + latch.countDown(); + return null; + }).when(function).execute(); + + indices.initCustomResultIndexAndExecute(indexName, function, listener); + latch.await(20, TimeUnit.SECONDS); + verify(listener, never()).onFailure(any(Exception.class)); + } +} diff --git a/src/test/java/org/opensearch/forecast/indices/ForecastIndexMappingTests.java b/src/test/java/org/opensearch/forecast/indices/ForecastIndexMappingTests.java new file mode 100644 index 000000000..a79eda373 --- /dev/null +++ b/src/test/java/org/opensearch/forecast/indices/ForecastIndexMappingTests.java @@ -0,0 +1,87 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.indices; + +import java.io.IOException; + +import org.opensearch.test.OpenSearchTestCase; + +import com.fasterxml.jackson.databind.JsonNode; +import com.fasterxml.jackson.databind.ObjectMapper; + +public class ForecastIndexMappingTests extends OpenSearchTestCase { + + public void testGetForecastResultMappings() throws IOException { + String mapping = ForecastIndexManagement.getResultMappings(); + + // Use Jackson to convert the string into a JsonNode + ObjectMapper mapper = new ObjectMapper(); + JsonNode mappingJson = mapper.readTree(mapping); + + // Check the existence of some fields + assertTrue("forecaster_id field is missing", mappingJson.path("properties").has("forecaster_id")); + assertTrue("feature_data field is missing", mappingJson.path("properties").has("feature_data")); + assertTrue("data_start_time field is missing", mappingJson.path("properties").has("data_start_time")); + assertTrue("execution_start_time field is missing", mappingJson.path("properties").has("execution_start_time")); + assertTrue("user field is 
missing", mappingJson.path("properties").has("user")); + assertTrue("entity field is missing", mappingJson.path("properties").has("entity")); + assertTrue("schema_version field is missing", mappingJson.path("properties").has("schema_version")); + assertTrue("task_id field is missing", mappingJson.path("properties").has("task_id")); + assertTrue("model_id field is missing", mappingJson.path("properties").has("model_id")); + assertTrue("forecast_series field is missing", mappingJson.path("properties").has("forecast_value")); + } + + public void testGetCheckpointMappings() throws IOException { + String mapping = ForecastIndexManagement.getCheckpointMappings(); + + // Use Jackson to convert the string into a JsonNode + ObjectMapper mapper = new ObjectMapper(); + JsonNode mappingJson = mapper.readTree(mapping); + + // Check the existence of some fields + assertTrue("forecaster_id field is missing", mappingJson.path("properties").has("forecaster_id")); + assertTrue("timestamp field is missing", mappingJson.path("properties").has("timestamp")); + assertTrue("schema_version field is missing", mappingJson.path("properties").has("schema_version")); + assertTrue("entity field is missing", mappingJson.path("properties").has("entity")); + assertTrue("model field is missing", mappingJson.path("properties").has("model")); + assertTrue("samples field is missing", mappingJson.path("properties").has("samples")); + assertTrue("last_processed_sample field is missing", mappingJson.path("properties").has("last_processed_sample")); + } + + public void testGetStateMappings() throws IOException { + String mapping = ForecastIndexManagement.getStateMappings(); + + // Use Jackson to convert the string into a JsonNode + ObjectMapper mapper = new ObjectMapper(); + JsonNode mappingJson = mapper.readTree(mapping); + + // Check the existence of some fields + assertTrue("schema_version field is missing", mappingJson.path("properties").has("schema_version")); + assertTrue("last_update_time field is missing", mappingJson.path("properties").has("last_update_time")); + assertTrue("error field is missing", mappingJson.path("properties").has("error")); + assertTrue("started_by field is missing", mappingJson.path("properties").has("started_by")); + assertTrue("stopped_by field is missing", mappingJson.path("properties").has("stopped_by")); + assertTrue("forecaster_id field is missing", mappingJson.path("properties").has("forecaster_id")); + assertTrue("state field is missing", mappingJson.path("properties").has("state")); + assertTrue("task_progress field is missing", mappingJson.path("properties").has("task_progress")); + assertTrue("init_progress field is missing", mappingJson.path("properties").has("init_progress")); + assertTrue("current_piece field is missing", mappingJson.path("properties").has("current_piece")); + assertTrue("execution_start_time field is missing", mappingJson.path("properties").has("execution_start_time")); + assertTrue("execution_end_time field is missing", mappingJson.path("properties").has("execution_end_time")); + assertTrue("is_latest field is missing", mappingJson.path("properties").has("is_latest")); + assertTrue("task_type field is missing", mappingJson.path("properties").has("task_type")); + assertTrue("checkpoint_id field is missing", mappingJson.path("properties").has("checkpoint_id")); + assertTrue("coordinating_node field is missing", mappingJson.path("properties").has("coordinating_node")); + assertTrue("worker_node field is missing", mappingJson.path("properties").has("worker_node")); + 
assertTrue("user field is missing", mappingJson.path("properties").has("user")); + assertTrue("forecaster field is missing", mappingJson.path("properties").has("forecaster")); + assertTrue("date_range field is missing", mappingJson.path("properties").has("date_range")); + assertTrue("parent_task_id field is missing", mappingJson.path("properties").has("parent_task_id")); + assertTrue("entity field is missing", mappingJson.path("properties").has("entity")); + assertTrue("estimated_minutes_left field is missing", mappingJson.path("properties").has("estimated_minutes_left")); + } + +} diff --git a/src/test/java/org/opensearch/forecast/indices/ForecastResultIndexTests.java b/src/test/java/org/opensearch/forecast/indices/ForecastResultIndexTests.java new file mode 100644 index 000000000..8cbe42d00 --- /dev/null +++ b/src/test/java/org/opensearch/forecast/indices/ForecastResultIndexTests.java @@ -0,0 +1,229 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.indices; + +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.doAnswer; +import static org.mockito.Mockito.mock; +import static org.mockito.Mockito.never; +import static org.mockito.Mockito.times; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +import java.io.IOException; +import java.util.Arrays; +import java.util.Collections; +import java.util.HashMap; +import java.util.HashSet; +import java.util.Map; + +import org.mockito.ArgumentCaptor; +import org.opensearch.ResourceAlreadyExistsException; +import org.opensearch.Version; +import org.opensearch.action.admin.cluster.state.ClusterStateRequest; +import org.opensearch.action.admin.cluster.state.ClusterStateResponse; +import org.opensearch.action.admin.indices.create.CreateIndexResponse; +import org.opensearch.client.AdminClient; +import org.opensearch.client.Client; +import org.opensearch.client.ClusterAdminClient; +import org.opensearch.client.IndicesAdminClient; +import org.opensearch.cluster.ClusterName; +import org.opensearch.cluster.ClusterState; +import org.opensearch.cluster.metadata.IndexMetadata; +import org.opensearch.cluster.metadata.Metadata; +import org.opensearch.cluster.service.ClusterService; +import org.opensearch.common.UUIDs; +import org.opensearch.common.settings.ClusterSettings; +import org.opensearch.common.settings.Settings; +import org.opensearch.core.action.ActionListener; +import org.opensearch.core.index.Index; +import org.opensearch.env.Environment; +import org.opensearch.forecast.settings.ForecastSettings; +import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.common.exception.EndRunException; +import org.opensearch.timeseries.function.ExecutorFunction; +import org.opensearch.timeseries.indices.IndexManagement; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.DiscoveryNodeFilterer; + +public class ForecastResultIndexTests extends AbstractTimeSeriesTest { + private ForecastIndexManagement forecastIndices; + private IndicesAdminClient indicesClient; + private ClusterAdminClient clusterAdminClient; + private ClusterName clusterName; + private ClusterState clusterState; + private ClusterService clusterService; + private long defaultMaxDocs; + private int numberOfNodes; + private Client client; + + @Override + public void setUp() throws Exception { + super.setUp(); + client = 
mock(Client.class); + indicesClient = mock(IndicesAdminClient.class); + AdminClient adminClient = mock(AdminClient.class); + clusterService = mock(ClusterService.class); + ClusterSettings clusterSettings = new ClusterSettings( + Settings.EMPTY, + Collections + .unmodifiableSet( + new HashSet<>( + Arrays + .asList( + ForecastSettings.FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD, + ForecastSettings.FORECAST_RESULT_HISTORY_ROLLOVER_PERIOD, + ForecastSettings.FORECAST_RESULT_HISTORY_RETENTION_PERIOD, + ForecastSettings.FORECAST_MAX_PRIMARY_SHARDS + ) + ) + ) + ); + + clusterName = new ClusterName("test"); + + when(clusterService.getClusterSettings()).thenReturn(clusterSettings); + + ThreadPool threadPool = mock(ThreadPool.class); + Settings settings = Settings.EMPTY; + when(client.admin()).thenReturn(adminClient); + when(adminClient.indices()).thenReturn(indicesClient); + + DiscoveryNodeFilterer nodeFilter = mock(DiscoveryNodeFilterer.class); + numberOfNodes = 2; + when(nodeFilter.getNumberOfEligibleDataNodes()).thenReturn(numberOfNodes); + + forecastIndices = new ForecastIndexManagement( + client, + clusterService, + threadPool, + settings, + nodeFilter, + TimeSeriesSettings.MAX_UPDATE_RETRY_TIMES + ); + + clusterAdminClient = mock(ClusterAdminClient.class); + when(adminClient.cluster()).thenReturn(clusterAdminClient); + + doAnswer(invocation -> { + ClusterStateRequest clusterStateRequest = invocation.getArgument(0); + assertEquals(ForecastIndexManagement.ALL_FORECAST_RESULTS_INDEX_PATTERN, clusterStateRequest.indices()[0]); + @SuppressWarnings("unchecked") + ActionListener<ClusterStateResponse> listener = (ActionListener<ClusterStateResponse>) invocation.getArgument(1); + listener.onResponse(new ClusterStateResponse(clusterName, clusterState, true)); + return null; + }).when(clusterAdminClient).state(any(), any()); + + defaultMaxDocs = ForecastSettings.FORECAST_RESULT_HISTORY_MAX_DOCS_PER_SHARD.getDefault(Settings.EMPTY); + + clusterState = ClusterState.builder(clusterName).metadata(Metadata.builder().build()).build(); + when(clusterService.state()).thenReturn(clusterState); + } + + public void testMappingSetToUpdated() throws IOException { + try { + doAnswer(invocation -> { + ActionListener<CreateIndexResponse> listener = (ActionListener<CreateIndexResponse>) invocation.getArgument(1); + listener.onResponse(new CreateIndexResponse(true, true, "blah")); + return null; + }).when(indicesClient).create(any(), any()); + + super.setUpLog4jForJUnit(IndexManagement.class); + + ActionListener listener = mock(ActionListener.class); + forecastIndices.initDefaultResultIndexDirectly(listener); + verify(listener, times(1)).onResponse(any(CreateIndexResponse.class)); + assertTrue(testAppender.containsMessage("mapping up-to-date")); + } finally { + super.tearDownLog4jForJUnit(); + } + + } + + public void testInitCustomResultIndexNoAck() { + ExecutorFunction function = mock(ExecutorFunction.class); + ActionListener listener = mock(ActionListener.class); + + doAnswer(invocation -> { + ActionListener<CreateIndexResponse> createIndexListener = (ActionListener<CreateIndexResponse>) invocation.getArgument(1); + createIndexListener.onResponse(new CreateIndexResponse(false, false, "blah")); + return null; + }).when(indicesClient).create(any(), any()); + + ArgumentCaptor<Exception> response = ArgumentCaptor.forClass(Exception.class); + forecastIndices.initCustomResultIndexAndExecute("abc", function, listener); + verify(listener, times(1)).onFailure(response.capture()); + Exception value = response.getValue(); + assertTrue(value instanceof EndRunException); + assertTrue( + "actual: " + value.getMessage(), + value.getMessage().contains("Creating result
index with mappings call not acknowledged") + ); + + } + + public void testInitCustomResultIndexAlreadyExist() throws IOException { + ExecutorFunction function = mock(ExecutorFunction.class); + ActionListener listener = mock(ActionListener.class); + + String indexName = "abc"; + + Settings settings = Settings + .builder() + .put(IndexMetadata.SETTING_VERSION_CREATED, Version.CURRENT) + .put(IndexMetadata.SETTING_NUMBER_OF_REPLICAS, 0) + .put(IndexMetadata.SETTING_NUMBER_OF_SHARDS, 1) + .put(IndexMetadata.SETTING_INDEX_UUID, UUIDs.randomBase64UUID()) + .put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString()) + .build(); + IndexMetadata indexMetaData = IndexMetadata + .builder(indexName) + .settings(settings) + .putMapping(ForecastIndexManagement.getResultMappings()) + .build(); + final Map indices = new HashMap<>(); + indices.put(indexName, indexMetaData); + + clusterState = ClusterState.builder(clusterName).metadata(Metadata.builder().indices(indices).build()).build(); + when(clusterService.state()).thenReturn(clusterState); + + doAnswer(invocation -> { + ActionListener createIndexListener = (ActionListener) invocation.getArgument(1); + createIndexListener.onFailure(new ResourceAlreadyExistsException(new Index(indexName, indexName))); + return null; + }).when(indicesClient).create(any(), any()); + + forecastIndices.initCustomResultIndexAndExecute(indexName, function, listener); + verify(listener, never()).onFailure(any()); + } + + public void testInitCustomResultIndexUnknownException() throws IOException { + ExecutorFunction function = mock(ExecutorFunction.class); + ActionListener listener = mock(ActionListener.class); + + String indexName = "abc"; + String exceptionMsg = "blah"; + + doAnswer(invocation -> { + ActionListener createIndexListener = (ActionListener) invocation.getArgument(1); + createIndexListener.onFailure(new IllegalArgumentException(exceptionMsg)); + return null; + }).when(indicesClient).create(any(), any()); + super.setUpLog4jForJUnit(IndexManagement.class); + try { + forecastIndices.initCustomResultIndexAndExecute(indexName, function, listener); + ArgumentCaptor response = ArgumentCaptor.forClass(Exception.class); + verify(listener, times(1)).onFailure(response.capture()); + + Exception value = response.getValue(); + assertTrue(value instanceof IllegalArgumentException); + assertTrue("actual: " + value.getMessage(), value.getMessage().contains(exceptionMsg)); + } finally { + super.tearDownLog4jForJUnit(); + } + } +} diff --git a/src/test/java/org/opensearch/forecast/model/ForecastResultTests.java b/src/test/java/org/opensearch/forecast/model/ForecastResultTests.java new file mode 100644 index 000000000..8e7c4e17a --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecastResultTests.java @@ -0,0 +1,103 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import java.io.IOException; +import java.time.Instant; +import java.util.ArrayList; +import java.util.Collections; +import java.util.List; +import java.util.Optional; + +import org.junit.Before; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.FeatureData; + +public class ForecastResultTests extends OpenSearchTestCase { + List result; + + @Override + @Before + public void setUp() throws 
Exception { + super.setUp(); + // Arrange + String forecasterId = "testId"; + long intervalMillis = 1000; + Double dataQuality = 0.9; + List<FeatureData> featureData = new ArrayList<>(); + featureData.add(new FeatureData("f1", "f1", 1.0d)); + featureData.add(new FeatureData("f2", "f2", 2.0d)); + long currentTimeMillis = System.currentTimeMillis(); + Instant instantFromMillis = Instant.ofEpochMilli(currentTimeMillis); + Instant dataStartTime = instantFromMillis; + Instant dataEndTime = dataStartTime.plusSeconds(10); + Instant executionStartTime = instantFromMillis; + Instant executionEndTime = executionStartTime.plusSeconds(10); + String error = null; + Optional<Entity> entity = Optional.empty(); + User user = new User("testUser", Collections.emptyList(), Collections.emptyList(), Collections.emptyList()); + Integer schemaVersion = 1; + String modelId = "testModelId"; + float[] forecastsValues = new float[] { 1.0f, 2.0f, 3.0f, 4.0f }; + float[] forecastsUppers = new float[] { 1.5f, 2.5f, 3.5f, 4.5f }; + float[] forecastsLowers = new float[] { 0.5f, 1.5f, 2.5f, 3.5f }; + String taskId = "testTaskId"; + + // Act + result = ForecastResult + .fromRawRCFCasterResult( + forecasterId, + intervalMillis, + dataQuality, + featureData, + dataStartTime, + dataEndTime, + executionStartTime, + executionEndTime, + error, + entity, + user, + schemaVersion, + modelId, + forecastsValues, + forecastsUppers, + forecastsLowers, + taskId + ); + } + + public void testFromRawRCFCasterResult() { + // Assert + assertEquals(5, result.size()); + assertEquals("f1", result.get(1).getFeatureId()); + assertEquals(1.0f, result.get(1).getForecastValue(), 0.01); + assertEquals("f2", result.get(2).getFeatureId()); + assertEquals(2.0f, result.get(2).getForecastValue(), 0.01); + + assertTrue( + "actual: " + result.toString(), + result + .toString() + .contains( + "featureId=f2,dataQuality=0.9,forecastValue=2.0,lowerBound=1.5,upperBound=2.5,confidenceIntervalWidth=1.0,forecastDataStartTime=" + ) + ); + } + + public void testParseForecastResult() throws IOException { + for (int i = 0; i < 5; i++) { + String forecastResultString = TestHelpers + .xContentBuilderToString(result.get(i).toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + ForecastResult parsedForecastResult = ForecastResult.parse(TestHelpers.parser(forecastResultString)); + assertEquals("Parsing forecast result doesn't work", result.get(i), parsedForecastResult); + assertTrue("Parsing forecast result doesn't work", result.get(i).hashCode() == parsedForecastResult.hashCode()); + } + } +} diff --git a/src/test/java/org/opensearch/forecast/model/ForecastSerializationTests.java b/src/test/java/org/opensearch/forecast/model/ForecastSerializationTests.java new file mode 100644 index 000000000..e83fce6d9 --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecastSerializationTests.java @@ -0,0 +1,85 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import java.io.IOException; +import java.util.Collection; + +import org.opensearch.common.io.stream.BytesStreamOutput; +import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; +import org.opensearch.core.common.io.stream.NamedWriteableRegistry; +import org.opensearch.plugins.Plugin; +import org.opensearch.test.InternalSettingsPlugin; +import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; + +public class
ForecastSerializationTests extends OpenSearchSingleNodeTestCase { + @Override + protected Collection<Class<? extends Plugin>> getPlugins() { + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); + } + + @Override + protected NamedWriteableRegistry writableRegistry() { + return getInstanceFromNode(NamedWriteableRegistry.class); + } + + public void testStreamConstructor() throws IOException { + Forecaster forecaster = TestHelpers.randomForecaster(); + + BytesStreamOutput output = new BytesStreamOutput(); + + forecaster.writeTo(output); + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + Forecaster parsedForecaster = new Forecaster(streamInput); + assertTrue(parsedForecaster.equals(forecaster)); + } + + public void testStreamConstructorNullUser() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setUser(null).build(); + + BytesStreamOutput output = new BytesStreamOutput(); + + forecaster.writeTo(output); + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + Forecaster parsedForecaster = new Forecaster(streamInput); + assertTrue(parsedForecaster.equals(forecaster)); + } + + public void testStreamConstructorNullUiMeta() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setUiMetadata(null).build(); + + BytesStreamOutput output = new BytesStreamOutput(); + + forecaster.writeTo(output); + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + Forecaster parsedForecaster = new Forecaster(streamInput); + assertTrue(parsedForecaster.equals(forecaster)); + } + + public void testStreamConstructorNullCustomResult() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setCustomResultIndex(null).build(); + + BytesStreamOutput output = new BytesStreamOutput(); + + forecaster.writeTo(output); + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + Forecaster parsedForecaster = new Forecaster(streamInput); + assertTrue(parsedForecaster.equals(forecaster)); + } + + public void testStreamConstructorNullImputationOption() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setNullImputationOption().build(); + + BytesStreamOutput output = new BytesStreamOutput(); + + forecaster.writeTo(output); + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + Forecaster parsedForecaster = new Forecaster(streamInput); + assertTrue(parsedForecaster.equals(forecaster)); + } +}
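Note: the five stream tests above share one round-trip skeleton, condensed below. The only assumption is what the tests themselves demonstrate: Forecaster is a Writeable with a StreamInput constructor. Wrapping with NamedWriteableAwareStreamInput is load-bearing here, since a Forecaster's features carry AggregationBuilder instances that must be resolved through the NamedWriteableRegistry; a bare StreamInput cannot read named writeables back.

    Forecaster original = TestHelpers.randomForecaster();
    BytesStreamOutput out = new BytesStreamOutput();
    original.writeTo(out); // serialize
    // deserialize through the registry so named writeables (aggregations) resolve
    Forecaster copy = new Forecaster(new NamedWriteableAwareStreamInput(out.bytes().streamInput(), writableRegistry()));
    assertEquals(original, copy);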
diff --git a/src/test/java/org/opensearch/forecast/model/ForecastTaskSerializationTests.java b/src/test/java/org/opensearch/forecast/model/ForecastTaskSerializationTests.java new file mode 100644 index 000000000..28ec18bab --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecastTaskSerializationTests.java @@ -0,0 +1,121 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import java.io.IOException; +import java.util.Collection; + +import org.opensearch.common.io.stream.BytesStreamOutput; +import org.opensearch.core.common.io.stream.NamedWriteableAwareStreamInput; +import org.opensearch.core.common.io.stream.NamedWriteableRegistry; +import org.opensearch.plugins.Plugin; +import org.opensearch.test.InternalSettingsPlugin; +import org.opensearch.test.OpenSearchSingleNodeTestCase; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; + +public class ForecastTaskSerializationTests extends OpenSearchSingleNodeTestCase { + private BytesStreamOutput output; + + @Override + protected Collection<Class<? extends Plugin>> getPlugins() { + return pluginList(InternalSettingsPlugin.class, TimeSeriesAnalyticsPlugin.class); + } + + @Override + protected NamedWriteableRegistry writableRegistry() { + return getInstanceFromNode(NamedWriteableRegistry.class); + } + + @Override + public void setUp() throws Exception { + super.setUp(); + + output = new BytesStreamOutput(); + } + + public void testConstructor_allFieldsPresent() throws IOException { + // Set up a StreamInput that contains all fields + ForecastTask originalTask = TestHelpers.ForecastTaskBuilder.newInstance().build(); + + originalTask.writeTo(output); + // required by AggregationBuilder in Feature's constructor for named writeable + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); + + ForecastTask readTask = new ForecastTask(streamInput); + + assertEquals("task123", readTask.getTaskId()); + assertEquals("FORECAST_HISTORICAL_HC_ENTITY", readTask.getTaskType()); + assertTrue(readTask.isEntityTask()); + assertEquals("config123", readTask.getConfigId()); + assertEquals(originalTask.getForecaster(), readTask.getForecaster()); + assertEquals("Running", readTask.getState()); + assertEquals(Float.valueOf(0.5f), readTask.getTaskProgress()); + assertEquals(Float.valueOf(0.1f), readTask.getInitProgress()); + assertEquals(originalTask.getCurrentPiece(), readTask.getCurrentPiece()); + assertEquals(originalTask.getExecutionStartTime(), readTask.getExecutionStartTime()); + assertEquals(originalTask.getExecutionEndTime(), readTask.getExecutionEndTime()); + assertEquals(Boolean.TRUE, readTask.isLatest()); + assertEquals("No errors", readTask.getError()); + assertEquals("checkpoint1", readTask.getCheckpointId()); + assertEquals(originalTask.getLastUpdateTime(), readTask.getLastUpdateTime()); + assertEquals("user1", readTask.getStartedBy()); + assertEquals("user2", readTask.getStoppedBy()); + assertEquals("node1", readTask.getCoordinatingNode()); + assertEquals("node2", readTask.getWorkerNode()); + assertEquals(originalTask.getUser(), readTask.getUser()); + assertEquals(originalTask.getDateRange(), readTask.getDateRange()); + assertEquals(originalTask.getEntity(), readTask.getEntity()); + // since entity attributes are random, we cannot have a fixed model id to verify + assertTrue(readTask.getEntityModelId().startsWith("config123_entity_")); + assertEquals("parentTask1", readTask.getParentTaskId()); + assertEquals(Integer.valueOf(10), readTask.getEstimatedMinutesLeft()); + } + + public void testConstructor_missingOptionalFields() throws IOException { + // Set up a StreamInput that omits the optional fields + ForecastTask originalTask = TestHelpers.ForecastTaskBuilder + .newInstance() + .setForecaster(null) + .setUser(null) + .setDateRange(null) + .setEntity(null) + .build(); + + originalTask.writeTo(output); + // required by AggregationBuilder in Feature's constructor for named writeable + NamedWriteableAwareStreamInput streamInput = new NamedWriteableAwareStreamInput(output.bytes().streamInput(), writableRegistry()); +
ForecastTask readTask = new ForecastTask(streamInput); + + assertEquals("task123", readTask.getTaskId()); + assertEquals("FORECAST_HISTORICAL_HC_ENTITY", readTask.getTaskType()); + assertTrue(readTask.isEntityTask()); + assertEquals("config123", readTask.getConfigId()); + assertEquals(null, readTask.getForecaster()); + assertEquals("Running", readTask.getState()); + assertEquals(Float.valueOf(0.5f), readTask.getTaskProgress()); + assertEquals(Float.valueOf(0.1f), readTask.getInitProgress()); + assertEquals(originalTask.getCurrentPiece(), readTask.getCurrentPiece()); + assertEquals(originalTask.getExecutionStartTime(), readTask.getExecutionStartTime()); + assertEquals(originalTask.getExecutionEndTime(), readTask.getExecutionEndTime()); + assertEquals(Boolean.TRUE, readTask.isLatest()); + assertEquals("No errors", readTask.getError()); + assertEquals("checkpoint1", readTask.getCheckpointId()); + assertEquals(originalTask.getLastUpdateTime(), readTask.getLastUpdateTime()); + assertEquals("user1", readTask.getStartedBy()); + assertEquals("user2", readTask.getStoppedBy()); + assertEquals("node1", readTask.getCoordinatingNode()); + assertEquals("node2", readTask.getWorkerNode()); + assertEquals(null, readTask.getUser()); + assertEquals(null, readTask.getDateRange()); + assertEquals(null, readTask.getEntity()); + assertEquals(null, readTask.getEntityModelId()); + assertEquals("parentTask1", readTask.getParentTaskId()); + assertEquals(Integer.valueOf(10), readTask.getEstimatedMinutesLeft()); + } + +} diff --git a/src/test/java/org/opensearch/forecast/model/ForecastTaskTests.java b/src/test/java/org/opensearch/forecast/model/ForecastTaskTests.java new file mode 100644 index 000000000..bce749757 --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecastTaskTests.java @@ -0,0 +1,38 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import java.io.IOException; + +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.TestHelpers; + +public class ForecastTaskTests extends OpenSearchTestCase { + public void testParse() throws IOException { + ForecastTask originalTask = TestHelpers.ForecastTaskBuilder.newInstance().build(); + String forecastTaskString = TestHelpers + .xContentBuilderToString(originalTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + ForecastTask parsedForecastTask = ForecastTask.parse(TestHelpers.parser(forecastTaskString)); + assertEquals("Parsing forecast task doesn't work", originalTask, parsedForecastTask); + } + + public void testParseEmptyForecaster() throws IOException { + ForecastTask originalTask = TestHelpers.ForecastTaskBuilder.newInstance().setForecaster(null).build(); + String forecastTaskString = TestHelpers + .xContentBuilderToString(originalTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + ForecastTask parsedForecastTask = ForecastTask.parse(TestHelpers.parser(forecastTaskString)); + assertEquals("Parsing forecast task doesn't work", originalTask, parsedForecastTask); + } + + public void testParseEmptyForecasterRange() throws IOException { + ForecastTask originalTask = TestHelpers.ForecastTaskBuilder.newInstance().setForecaster(null).setDateRange(null).build(); + String forecastTaskString = TestHelpers + .xContentBuilderToString(originalTask.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + ForecastTask parsedForecastTask = 
ForecastTask.parse(TestHelpers.parser(forecastTaskString)); + assertEquals("Parsing forecast task doesn't work", originalTask, parsedForecastTask); + } +} diff --git a/src/test/java/org/opensearch/forecast/model/ForecastTaskTypeTests.java b/src/test/java/org/opensearch/forecast/model/ForecastTaskTypeTests.java new file mode 100644 index 000000000..4ee403a0e --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecastTaskTypeTests.java @@ -0,0 +1,53 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import java.util.Arrays; + +import org.opensearch.test.OpenSearchTestCase; + +public class ForecastTaskTypeTests extends OpenSearchTestCase { + + public void testHistoricalForecasterTaskTypes() { + assertEquals( + Arrays.asList(ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM), + ForecastTaskType.HISTORICAL_FORECASTER_TASK_TYPES + ); + } + + public void testAllHistoricalTaskTypes() { + assertEquals( + Arrays + .asList( + ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM, + ForecastTaskType.FORECAST_HISTORICAL_HC_ENTITY + ), + ForecastTaskType.ALL_HISTORICAL_TASK_TYPES + ); + } + + public void testRealtimeTaskTypes() { + assertEquals( + Arrays.asList(ForecastTaskType.FORECAST_REALTIME_SINGLE_STREAM, ForecastTaskType.FORECAST_REALTIME_HC_FORECASTER), + ForecastTaskType.REALTIME_TASK_TYPES + ); + } + + public void testAllForecastTaskTypes() { + assertEquals( + Arrays + .asList( + ForecastTaskType.FORECAST_REALTIME_SINGLE_STREAM, + ForecastTaskType.FORECAST_REALTIME_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_SINGLE_STREAM, + ForecastTaskType.FORECAST_HISTORICAL_HC_FORECASTER, + ForecastTaskType.FORECAST_HISTORICAL_HC_ENTITY + ), + ForecastTaskType.ALL_FORECAST_TASK_TYPES + ); + } +} diff --git a/src/test/java/org/opensearch/forecast/model/ForecasterTests.java b/src/test/java/org/opensearch/forecast/model/ForecasterTests.java new file mode 100644 index 000000000..0b64912bf --- /dev/null +++ b/src/test/java/org/opensearch/forecast/model/ForecasterTests.java @@ -0,0 +1,396 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.model; + +import static org.hamcrest.CoreMatchers.containsString; +import static org.hamcrest.Matchers.is; + +import java.io.IOException; +import java.time.Instant; +import java.time.temporal.ChronoUnit; +import java.util.Arrays; +import java.util.Collections; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +import org.hamcrest.MatcherAssert; +import org.opensearch.commons.authuser.User; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.forecast.constant.ForecastCommonMessages; +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.index.query.MatchAllQueryBuilder; +import org.opensearch.index.query.QueryBuilders; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.common.exception.ValidationException; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.dataprocessor.ImputationOption; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.TimeConfiguration; +import 
org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; + +public class ForecasterTests extends AbstractTimeSeriesTest { + TimeConfiguration forecastInterval = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES); + TimeConfiguration windowDelay = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES); + String forecasterId = "testId"; + Long version = 1L; + String name = "testName"; + String description = "testDescription"; + String timeField = "testTimeField"; + List<String> indices = Collections.singletonList("testIndex"); + List<Feature> features = Collections.emptyList(); // Assuming no features for simplicity + MatchAllQueryBuilder filterQuery = QueryBuilders.matchAllQuery(); + Integer shingleSize = 1; + Map<String, Object> uiMetadata = new HashMap<>(); + Integer schemaVersion = 1; + Instant lastUpdateTime = Instant.now(); + List<String> categoryFields = Arrays.asList("field1", "field2"); + User user = new User("testUser", Collections.emptyList(), Collections.emptyList(), Collections.emptyList()); + String resultIndex = null; + Integer horizon = 1; + + public void testForecasterConstructor() { + ImputationOption imputationOption = TestHelpers.randomImputationOption(); + + Forecaster forecaster = new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + imputationOption + ); + + assertEquals(forecasterId, forecaster.getId()); + assertEquals(version, forecaster.getVersion()); + assertEquals(name, forecaster.getName()); + assertEquals(description, forecaster.getDescription()); + assertEquals(timeField, forecaster.getTimeField()); + assertEquals(indices, forecaster.getIndices()); + assertEquals(features, forecaster.getFeatureAttributes()); + assertEquals(filterQuery, forecaster.getFilterQuery()); + assertEquals(forecastInterval, forecaster.getInterval()); + assertEquals(windowDelay, forecaster.getWindowDelay()); + assertEquals(shingleSize, forecaster.getShingleSize()); + assertEquals(uiMetadata, forecaster.getUiMetadata()); + assertEquals(schemaVersion, forecaster.getSchemaVersion()); + assertEquals(lastUpdateTime, forecaster.getLastUpdateTime()); + assertEquals(categoryFields, forecaster.getCategoryFields()); + assertEquals(user, forecaster.getUser()); + assertEquals(resultIndex, forecaster.getCustomResultIndex()); + assertEquals(horizon, forecaster.getHorizon()); + assertEquals(imputationOption, forecaster.getImputationOption()); + } + + public void testForecasterConstructorWithNullForecastInterval() { + TimeConfiguration forecastInterval = null; + + ValidationException ex = expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString(ForecastCommonMessages.NULL_FORECAST_INTERVAL)); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.FORECAST_INTERVAL)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testNegativeInterval() { + var forecastInterval = new IntervalTimeConfiguration(0, ChronoUnit.MINUTES); // An interval less than or equal to zero
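/*
 * The constructor-validation tests below all share one shape: build a
 * Forecaster with exactly one invalid argument, capture the
 * ValidationException with expectThrows, then assert on the message, the
 * ValidationIssueType, and the ValidationAspect. A minimal sketch of that
 * pattern as a reusable helper (the helper name is illustrative only and
 * not part of this patch):
 *
 *   private void assertForecasterRejected(Runnable ctorCall, String msgFragment, ValidationIssueType expectedType) {
 *       ValidationException ex = expectThrows(ValidationException.class, ctorCall::run);
 *       MatcherAssert.assertThat(ex.getMessage(), containsString(msgFragment));
 *       MatcherAssert.assertThat(ex.getType(), is(expectedType));
 *       MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER));
 *   }
 */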
+ + ValidationException ex = expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString(ForecastCommonMessages.INVALID_FORECAST_INTERVAL)); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.FORECAST_INTERVAL)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testMaxCategoryFieldsLimits() { + List categoryFields = Arrays.asList("field1", "field2", "field3"); + + ValidationException ex = expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString(CommonMessages.getTooManyCategoricalFieldErr(2))); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.CATEGORY)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testBlankName() { + String name = ""; + + ValidationException ex = expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString(CommonMessages.EMPTY_NAME)); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.NAME)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testInvalidCustomResultIndex() { + String resultIndex = "test"; + + ValidationException ex = expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString(ForecastCommonMessages.INVALID_RESULT_INDEX_PREFIX)); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.RESULT_INDEX)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testValidCustomResultIndex() { + String resultIndex = ForecastCommonName.CUSTOM_RESULT_INDEX_PREFIX + "test"; + + var forecaster = new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + + assertEquals(resultIndex, forecaster.getCustomResultIndex()); + } + + public void testInvalidHorizon() { + int horizon = 0; + + ValidationException ex = 
expectThrows(ValidationException.class, () -> { + new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + TestHelpers.randomImputationOption() + ); + }); + + MatcherAssert.assertThat(ex.getMessage(), containsString("Horizon size must be a positive integer no larger than")); + MatcherAssert.assertThat(ex.getType(), is(ValidationIssueType.SHINGLE_SIZE_FIELD)); + MatcherAssert.assertThat(ex.getAspect(), is(ValidationAspect.FORECASTER)); + } + + public void testParse() throws IOException { + Forecaster forecaster = TestHelpers.randomForecaster(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseEmptyMetaData() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setUiMetadata(null).build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseNullLastUpdateTime() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setLastUpdateTime(null).build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseNullCategoryFields() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setCategoryFields(null).build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseNullUser() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setUser(null).build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseNullCustomResultIndex() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setCustomResultIndex(null).build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = 
Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testParseNullImpute() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setNullImputationOption().build(); + String forecasterString = TestHelpers + .xContentBuilderToString(forecaster.toXContent(TestHelpers.builder(), ToXContent.EMPTY_PARAMS)); + LOG.info(forecasterString); + Forecaster parsedForecaster = Forecaster.parse(TestHelpers.parser(forecasterString)); + assertEquals("Parsing forecaster doesn't work", forecaster, parsedForecaster); + } + + public void testGetImputer() throws IOException { + Forecaster forecaster = TestHelpers.randomForecaster(); + assertTrue(null != forecaster.getImputer()); + } + + public void testGetImputerNullImputer() throws IOException { + Forecaster forecaster = TestHelpers.ForecasterBuilder.newInstance().setNullImputationOption().build(); + assertTrue(null != forecaster.getImputer()); + } +} diff --git a/src/test/java/org/opensearch/forecast/settings/ForecastEnabledSettingTests.java b/src/test/java/org/opensearch/forecast/settings/ForecastEnabledSettingTests.java new file mode 100644 index 000000000..dda3a8761 --- /dev/null +++ b/src/test/java/org/opensearch/forecast/settings/ForecastEnabledSettingTests.java @@ -0,0 +1,30 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.settings; + +import org.opensearch.test.OpenSearchTestCase; + +public class ForecastEnabledSettingTests extends OpenSearchTestCase { + + public void testIsForecastEnabled() { + assertTrue(ForecastEnabledSetting.isForecastEnabled()); + ForecastEnabledSetting.getInstance().setSettingValue(ForecastEnabledSetting.FORECAST_ENABLED, false); + assertTrue(!ForecastEnabledSetting.isForecastEnabled()); + } + + public void testIsForecastBreakerEnabled() { + assertTrue(ForecastEnabledSetting.isForecastBreakerEnabled()); + ForecastEnabledSetting.getInstance().setSettingValue(ForecastEnabledSetting.FORECAST_BREAKER_ENABLED, false); + assertTrue(!ForecastEnabledSetting.isForecastBreakerEnabled()); + } + + public void testIsDoorKeeperInCacheEnabled() { + assertTrue(!ForecastEnabledSetting.isDoorKeeperInCacheEnabled()); + ForecastEnabledSetting.getInstance().setSettingValue(ForecastEnabledSetting.FORECAST_DOOR_KEEPER_IN_CACHE_ENABLED, true); + assertTrue(ForecastEnabledSetting.isDoorKeeperInCacheEnabled()); + } + +} diff --git a/src/test/java/org/opensearch/forecast/settings/ForecastNumericSettingTests.java b/src/test/java/org/opensearch/forecast/settings/ForecastNumericSettingTests.java new file mode 100644 index 000000000..80b2202bf --- /dev/null +++ b/src/test/java/org/opensearch/forecast/settings/ForecastNumericSettingTests.java @@ -0,0 +1,50 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.forecast.settings; + +import java.util.HashMap; +import java.util.Map; + +import org.junit.Before; +import org.opensearch.common.settings.Setting; +import org.opensearch.test.OpenSearchTestCase; + +public class ForecastNumericSettingTests extends OpenSearchTestCase { + private ForecastNumericSetting forecastSetting; + + @Override + @Before + public void setUp() throws Exception { + super.setUp(); + forecastSetting = ForecastNumericSetting.getInstance(); + } + + public void testMaxCategoricalFields() { + forecastSetting.setSettingValue(ForecastNumericSetting.CATEGORY_FIELD_LIMIT, 3); 
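/*
 * ForecastNumericSetting is a singleton (see getInstance() in setUp) that
 * wraps a map from setting key to OpenSearch Setting objects:
 * setSettingValue caches an override and getSettingValue returns the cached
 * value, falling back to the Setting's default. A minimal sketch of that
 * read/write cycle with the type parameters spelled out (assumed wiring,
 * not a verbatim copy of the production class):
 *
 *   Map<String, Setting<?>> settingsMap = new HashMap<>();
 *   settingsMap.put("test.setting", Setting.intSetting("test.setting", 1, Setting.Property.NodeScope));
 *   ForecastNumericSetting setting = new ForecastNumericSetting(settingsMap);
 *   setting.setSettingValue("test.setting", 2);            // override the default of 1
 *   int current = setting.getSettingValue("test.setting"); // returns 2
 */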
+ int value = ForecastNumericSetting.maxCategoricalFields(); + assertEquals("Expected value is 3", 3, value); + } + + public void testGetSettingValue() { + Map<String, Setting<?>> settingsMap = new HashMap<>(); + Setting<Integer> testSetting = Setting.intSetting("test.setting", 1, Setting.Property.NodeScope); + settingsMap.put("test.setting", testSetting); + forecastSetting = new ForecastNumericSetting(settingsMap); + + forecastSetting.setSettingValue("test.setting", 2); + Integer value = forecastSetting.getSettingValue("test.setting"); + assertEquals("Expected value is 2", 2, value.intValue()); + } + + public void testGetSettingNonexistentKey() { + try { + forecastSetting.getSettingValue("nonexistent.key"); + fail("Expected an IllegalArgumentException to be thrown"); + } catch (IllegalArgumentException e) { + assertEquals("Cannot find setting by key [nonexistent.key]", e.getMessage()); + } + } +} diff --git a/src/test/java/org/opensearch/search/aggregations/metrics/CardinalityProfileTests.java b/src/test/java/org/opensearch/search/aggregations/metrics/CardinalityProfileTests.java index af49c4a81..7f024ef6d 100644 --- a/src/test/java/org/opensearch/search/aggregations/metrics/CardinalityProfileTests.java +++ b/src/test/java/org/opensearch/search/aggregations/metrics/CardinalityProfileTests.java @@ -16,8 +16,6 @@ import static org.mockito.Mockito.doThrow; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -import static org.opensearch.ad.model.AnomalyDetector.ANOMALY_DETECTORS_INDEX; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import java.io.IOException; import java.time.temporal.ChronoUnit; @@ -32,16 +30,11 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.ad.AbstractProfileRunnerTests; import org.opensearch.ad.AnomalyDetectorProfileRunner; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.transport.ProfileAction; import org.opensearch.ad.transport.ProfileNodeResponse; import org.opensearch.ad.transport.ProfileResponse; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.cluster.ClusterName; import org.opensearch.common.settings.Settings; import org.opensearch.common.util.BigArrays; @@ -49,6 +42,13 @@ import org.opensearch.core.action.ActionListener; import org.opensearch.search.aggregations.InternalAggregation; import org.opensearch.search.aggregations.InternalAggregations; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.util.SecurityClientUtil; /** * Run tests in ES package since InternalCardinality has only package private constructors @@ -73,10 +73,10 @@ private void setUpMultiEntityClientGet(DetectorStatus detectorStatus, JobStatus .randomAnomalyDetectorWithInterval(new IntervalTimeConfiguration(detectorIntervalMin, ChronoUnit.MINUTES), true); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(1); +
ActionListener<Optional<AnomalyDetector>> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(anyString(), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(anyString(), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, Settings.EMPTY); runner = new AnomalyDetectorProfileRunner( client, @@ -93,33 +93,27 @@ private void setUpMultiEntityClientGet(DetectorStatus detectorStatus, JobStatus GetRequest request = (GetRequest) args[0]; ActionListener listener = (ActionListener) args[1]; - if (request.index().equals(ANOMALY_DETECTORS_INDEX)) { + if (request.index().equals(CommonName.CONFIG_INDEX)) { switch (detectorStatus) { case EXIST: - listener - .onResponse( - TestHelpers.createGetResponse(detector, detector.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(detector, detector.getId(), CommonName.CONFIG_INDEX)); break; default: assertTrue("should not reach here", false); break; } - } else if (request.index().equals(ANOMALY_DETECTOR_JOB_INDEX)) { - AnomalyDetectorJob job = null; + } else if (request.index().equals(CommonName.JOB_INDEX)) { + Job job = null; switch (jobStatus) { case ENABLED: job = TestHelpers.randomAnomalyDetectorJob(true); - listener - .onResponse( - TestHelpers.createGetResponse(job, detector.getDetectorId(), AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(job, detector.getId(), CommonName.JOB_INDEX)); break; default: assertTrue("should not reach here", false); break; } - } else if (request.index().equals(CommonName.DETECTION_STATE_INDEX)) { + } else if (request.index().equals(ADCommonName.DETECTION_STATE_INDEX)) { switch (errorResultStatus) { case NO_ERROR: listener.onResponse(null); @@ -145,7 +139,7 @@ private void setUpMultiEntityClientSearch(ADResultStatus resultStatus, Cardinali Object[] args = invocation.getArguments(); ActionListener listener = (ActionListener) args[1]; SearchRequest request = (SearchRequest) args[0]; - if (request.indices()[0].equals(CommonName.ANOMALY_RESULT_INDEX_ALIAS)) { + if (request.indices()[0].equals(ADCommonName.ANOMALY_RESULT_INDEX_ALIAS)) { switch (resultStatus) { case NO_RESULT: SearchResponse mockResponse = mock(SearchResponse.class); @@ -175,7 +169,7 @@ private void setUpMultiEntityClientSearch(ADResultStatus resultStatus, Cardinali for (int i = 0; i < 100; i++) { hyperLogLog.collect(0, BitMixer.mix64(randomIntBetween(1, 100))); } - aggs.add(new InternalCardinality(CommonName.TOTAL_ENTITIES, hyperLogLog, new HashMap<>())); + aggs.add(new InternalCardinality(ADCommonName.TOTAL_ENTITIES, hyperLogLog, new HashMap<>())); when(response.getAggregations()).thenReturn(InternalAggregations.from(aggs)); listener.onResponse(response); break; @@ -220,7 +214,7 @@ public void testFailGetEntityStats() throws IOException, InterruptedException { final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertTrue("Should not reach here ", false); inProgressLatch.countDown(); }, exception -> { @@ -242,7 +236,7 @@ public void testNoResultsNoError() throws IOException, InterruptedException { final AtomicInteger called = new AtomicInteger(0); - runner.profile(detector.getDetectorId(), 
ActionListener.wrap(response -> { assertTrue(response.getInitProgress() != null); called.getAndIncrement(); }, exception -> { @@ -264,7 +258,7 @@ public void testFailConfirmInitted() throws IOException, InterruptedException { final CountDownLatch inProgressLatch = new CountDownLatch(1); - runner.profile(detector.getDetectorId(), ActionListener.wrap(response -> { + runner.profile(detector.getId(), ActionListener.wrap(response -> { assertTrue("Should not reach here ", false); inProgressLatch.countDown(); }, exception -> { diff --git a/src/test/java/org/opensearch/ad/AbstractADTest.java b/src/test/java/org/opensearch/timeseries/AbstractTimeSeriesTest.java similarity index 96% rename from src/test/java/org/opensearch/ad/AbstractADTest.java rename to src/test/java/org/opensearch/timeseries/AbstractTimeSeriesTest.java index 93e972174..d625971bf 100644 --- a/src/test/java/org/opensearch/ad/AbstractADTest.java +++ b/src/test/java/org/opensearch/timeseries/AbstractTimeSeriesTest.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import static org.hamcrest.Matchers.containsString; import static org.mockito.ArgumentMatchers.any; @@ -43,7 +43,6 @@ import org.opensearch.Version; import org.opensearch.action.support.PlainActionFuture; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.model.DetectorInternalState; import org.opensearch.cluster.metadata.AliasMetadata; @@ -64,14 +63,15 @@ import org.opensearch.threadpool.FixedExecutorBuilder; import org.opensearch.threadpool.TestThreadPool; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.model.Job; import org.opensearch.transport.TransportInterceptor; import org.opensearch.transport.TransportService; import test.org.opensearch.ad.util.FakeNode; -public class AbstractADTest extends OpenSearchTestCase { +public class AbstractTimeSeriesTest extends OpenSearchTestCase { - protected static final Logger LOG = (Logger) LogManager.getLogger(AbstractADTest.class); + protected static final Logger LOG = (Logger) LogManager.getLogger(AbstractTimeSeriesTest.class); // transport test node protected int nodesCount; @@ -213,7 +213,7 @@ private String convertToRegex(String formattedStr) { protected TestAppender testAppender; - Logger logger; + protected Logger logger; /** * Set up test with junit that a warning was logged with log4j @@ -260,10 +260,10 @@ protected static void setUpThreadPool(String name) { name, new FixedExecutorBuilder( Settings.EMPTY, - AnomalyDetectorPlugin.AD_THREAD_POOL_NAME, + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME, 1, 1000, - "opensearch.ad." + AnomalyDetectorPlugin.AD_THREAD_POOL_NAME + "opensearch.ad." + TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME ) ); } @@ -351,7 +351,7 @@ protected NamedXContentRegistry xContentRegistry() { AnomalyDetector.XCONTENT_REGISTRY, AnomalyResult.XCONTENT_REGISTRY, DetectorInternalState.XCONTENT_REGISTRY, - AnomalyDetectorJob.XCONTENT_REGISTRY + Job.XCONTENT_REGISTRY ) ); return new NamedXContentRegistry(entries); @@ -445,7 +445,7 @@ protected IndexMetadata indexMeta(String name, long creationDate, String... 
alia protected void setUpADThreadPool(ThreadPool mockThreadPool) { ExecutorService executorService = mock(ExecutorService.class); - when(mockThreadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(mockThreadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); doAnswer(invocation -> { Runnable runnable = invocation.getArgument(0); runnable.run(); diff --git a/src/test/java/org/opensearch/timeseries/DataByFeatureIdTests.java b/src/test/java/org/opensearch/timeseries/DataByFeatureIdTests.java new file mode 100644 index 000000000..631ba99e3 --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/DataByFeatureIdTests.java @@ -0,0 +1,81 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries; + +import java.io.IOException; + +import org.junit.Before; +import org.opensearch.common.io.stream.BytesStreamOutput; +import org.opensearch.common.xcontent.XContentFactory; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.model.DataByFeatureId; + +public class DataByFeatureIdTests extends OpenSearchTestCase { + String expectedFeatureId = "testFeature"; + Double expectedData = 123.45; + DataByFeatureId dataByFeatureId; + + @Before + public void setup() { + dataByFeatureId = new DataByFeatureId(expectedFeatureId, expectedData); + } + + public void testInputOutputStream() throws IOException { + + BytesStreamOutput output = new BytesStreamOutput(); + dataByFeatureId.writeTo(output); + StreamInput streamInput = output.bytes().streamInput(); + DataByFeatureId restoredDataByFeatureId = new DataByFeatureId(streamInput); + assertEquals(expectedFeatureId, restoredDataByFeatureId.getFeatureId()); + assertEquals(expectedData, restoredDataByFeatureId.getData()); + } + + public void testToXContent() throws IOException { + XContentBuilder builder = XContentFactory.jsonBuilder(); + + dataByFeatureId.toXContent(builder, ToXContent.EMPTY_PARAMS); + + XContentParser parser = createParser(builder); + // advance to first token + XContentParser.Token token = parser.nextToken(); + if (token != XContentParser.Token.START_OBJECT) { + throw new IOException("Expected data to start with an Object"); + } + + DataByFeatureId parsedDataByFeatureId = DataByFeatureId.parse(parser); + + assertEquals(expectedFeatureId, parsedDataByFeatureId.getFeatureId()); + assertEquals(expectedData, parsedDataByFeatureId.getData()); + } + + public void testEqualsAndHashCode() { + DataByFeatureId dataByFeatureId1 = new DataByFeatureId("feature1", 1.0); + DataByFeatureId dataByFeatureId2 = new DataByFeatureId("feature1", 1.0); + DataByFeatureId dataByFeatureId3 = new DataByFeatureId("feature2", 2.0); + + // Test equal objects are equal + assertEquals(dataByFeatureId1, dataByFeatureId2); + assertEquals(dataByFeatureId1.hashCode(), dataByFeatureId2.hashCode()); + + // Test unequal objects are not equal + assertNotEquals(dataByFeatureId1, dataByFeatureId3); + assertNotEquals(dataByFeatureId1.hashCode(), dataByFeatureId3.hashCode()); + + // Test object is not equal to null + assertNotEquals(dataByFeatureId1, null); + + // Test object is not equal to object of different type + assertNotEquals(dataByFeatureId1, "string"); + + // Test object is equal to itself + 
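/*
 * The reflexivity check below completes the java.lang.Object contract:
 * x.equals(x) must hold, and equal objects must share a hash code. An
 * implementation consistent with all of these assertions would look roughly
 * like the following (a sketch that assumes equality is defined over
 * featureId and data; the actual class may differ):
 *
 *   @Override
 *   public boolean equals(Object o) {
 *       if (this == o) return true;
 *       if (!(o instanceof DataByFeatureId)) return false;
 *       DataByFeatureId other = (DataByFeatureId) o;
 *       return Objects.equals(featureId, other.featureId) && Objects.equals(data, other.data);
 *   }
 *
 *   @Override
 *   public int hashCode() {
 *       return Objects.hash(featureId, data);
 *   }
 */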
assertEquals(dataByFeatureId1, dataByFeatureId1); + assertEquals(dataByFeatureId1.hashCode(), dataByFeatureId1.hashCode()); + } +} diff --git a/src/test/java/org/opensearch/ad/NodeStateManagerTests.java b/src/test/java/org/opensearch/timeseries/NodeStateManagerTests.java similarity index 66% rename from src/test/java/org/opensearch/ad/NodeStateManagerTests.java rename to src/test/java/org/opensearch/timeseries/NodeStateManagerTests.java index 1924e7cfe..e52255818 100644 --- a/src/test/java/org/opensearch/ad/NodeStateManagerTests.java +++ b/src/test/java/org/opensearch/timeseries/NodeStateManagerTests.java @@ -9,8 +9,9 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; +import static org.hamcrest.Matchers.equalTo; import static org.mockito.ArgumentMatchers.any; import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.mock; @@ -18,8 +19,6 @@ import static org.mockito.Mockito.verify; import static org.mockito.Mockito.verifyNoInteractions; import static org.mockito.Mockito.when; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES; -import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE; import java.io.IOException; import java.time.Clock; @@ -29,9 +28,12 @@ import java.util.Collections; import java.util.HashSet; import java.util.Locale; +import java.util.Optional; import java.util.Set; import java.util.concurrent.CountDownLatch; import java.util.concurrent.TimeUnit; +import java.util.concurrent.atomic.AtomicReference; +import java.util.function.Consumer; import java.util.stream.IntStream; import org.junit.After; @@ -39,15 +41,11 @@ import org.junit.Before; import org.junit.BeforeClass; import org.opensearch.Version; +import org.opensearch.action.LatchedActionListener; import org.opensearch.action.get.GetRequest; import org.opensearch.action.get.GetResponse; -import org.opensearch.action.search.SearchRequest; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.AnomalyDetectorJob; -import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.ad.transport.AnomalyResultTests; -import org.opensearch.ad.util.ClientUtil; -import org.opensearch.ad.util.Throttler; import org.opensearch.client.Client; import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.node.DiscoveryNodeRole; @@ -58,20 +56,25 @@ import org.opensearch.common.unit.TimeValue; import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.NamedXContentRegistry; +import org.opensearch.forecast.model.Forecaster; import org.opensearch.search.SearchModule; import org.opensearch.test.ClusterServiceUtils; import org.opensearch.test.OpenSearchTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.ClientUtil; import com.google.common.collect.ImmutableMap; -public class NodeStateManagerTests extends AbstractADTest { +public class NodeStateManagerTests extends AbstractTimeSeriesTest { private NodeStateManager stateManager; private Client client; private ClientUtil clientUtil; private Clock clock; private Duration duration; - private Throttler throttler; private ThreadPool context; private AnomalyDetector detectorToCheck; private Settings settings; @@ -81,7 +84,7 @@ 
public class NodeStateManagerTests extends AbstractADTest { private GetResponse checkpointResponse; private ClusterService clusterService; private ClusterSettings clusterSettings; - private AnomalyDetectorJob jobToCheck; + private Job jobToCheck; @Override protected NamedXContentRegistry xContentRegistry() { @@ -106,18 +109,17 @@ public void setUp() throws Exception { client = mock(Client.class); settings = Settings .builder() - .put("plugins.anomaly_detection.max_retry_for_unresponsive_node", 3) - .put("plugins.anomaly_detection.ad_mute_minutes", TimeValue.timeValueMinutes(10)) + .put("plugins.timeseries.max_retry_for_unresponsive_node", 3) + .put("plugins.timeseries.backoff_minutes", TimeValue.timeValueMinutes(10)) .build(); clock = mock(Clock.class); duration = Duration.ofHours(1); context = TestHelpers.createThreadPool(); - throttler = new Throttler(clock); - clientUtil = new ClientUtil(Settings.EMPTY, client, throttler, mock(ThreadPool.class)); + clientUtil = new ClientUtil(client); Set> nodestateSetting = new HashSet<>(ClusterSettings.BUILT_IN_CLUSTER_SETTINGS); - nodestateSetting.add(MAX_RETRY_FOR_UNRESPONSIVE_NODE); - nodestateSetting.add(BACKOFF_MINUTES); + nodestateSetting.add(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE); + nodestateSetting.add(TimeSeriesSettings.BACKOFF_MINUTES); clusterSettings = new ClusterSettings(Settings.EMPTY, nodestateSetting); DiscoveryNode discoveryNode = new DiscoveryNode( @@ -129,7 +131,17 @@ public void setUp() throws Exception { ); clusterService = ClusterServiceUtils.createClusterService(threadPool, discoveryNode, clusterSettings); - stateManager = new NodeStateManager(client, xContentRegistry(), settings, clientUtil, clock, duration, clusterService); + stateManager = new NodeStateManager( + client, + xContentRegistry(), + settings, + clientUtil, + clock, + duration, + clusterService, + TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE, + TimeSeriesSettings.BACKOFF_MINUTES + ); checkpointResponse = mock(GetResponse.class); jobToCheck = TestHelpers.randomAnomalyDetectorJob(true, Instant.ofEpochMilli(1602401500000L), null); @@ -166,14 +178,11 @@ private String setupDetector() throws IOException { } assertTrue(request != null && listener != null); - listener - .onResponse( - TestHelpers.createGetResponse(detectorToCheck, detectorToCheck.getDetectorId(), AnomalyDetector.ANOMALY_DETECTORS_INDEX) - ); + listener.onResponse(TestHelpers.createGetResponse(detectorToCheck, detectorToCheck.getId(), CommonName.CONFIG_INDEX)); return null; }).when(client).get(any(), any(ActionListener.class)); - return detectorToCheck.getDetectorId(); + return detectorToCheck.getId(); } @SuppressWarnings("unchecked") @@ -203,13 +212,6 @@ private void setupCheckpoint(boolean responseExists) throws IOException { }).when(client).get(any(), any(ActionListener.class)); } - public void testGetLastError() throws IOException, InterruptedException { - String error = "blah"; - assertEquals(NodeStateManager.NO_ERROR, stateManager.getLastDetectionError(adId)); - stateManager.setLastDetectionError(adId, error); - assertEquals(error, stateManager.getLastDetectionError(adId)); - } - public void testShouldMute() { assertTrue(!stateManager.isMuted(nodeId, adId)); @@ -235,28 +237,11 @@ public void testMaintenanceDoNothing() { verifyNoInteractions(clock); } - public void testHasRunningQuery() throws IOException { - stateManager = new NodeStateManager( - client, - xContentRegistry(), - settings, - new ClientUtil(settings, client, throttler, context), - clock, - duration, - clusterService - 
); - - AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of(), null); - SearchRequest dummySearchRequest = new SearchRequest(); - assertFalse(stateManager.hasRunningQuery(detector)); - throttler.insertFilteredQuery(detector.getDetectorId(), dummySearchRequest); - assertTrue(stateManager.hasRunningQuery(detector)); - } - public void testGetAnomalyDetector() throws IOException, InterruptedException { String detectorId = setupDetector(); + final CountDownLatch inProgressLatch = new CountDownLatch(1); - stateManager.getAnomalyDetector(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(asDetector -> { assertEquals(detectorToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -276,7 +261,7 @@ public void testRepeatedGetAnomalyDetector() throws IOException, InterruptedExce String detectorId = setupDetector(); final CountDownLatch inProgressLatch = new CountDownLatch(2); - stateManager.getAnomalyDetector(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(asDetector -> { assertEquals(detectorToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -284,7 +269,7 @@ public void testRepeatedGetAnomalyDetector() throws IOException, InterruptedExce inProgressLatch.countDown(); })); - stateManager.getAnomalyDetector(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getConfig(detectorId, AnalysisType.AD, ActionListener.wrap(asDetector -> { assertEquals(detectorToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -362,7 +347,7 @@ public void testSettingUpdateMaxRetry() { // In setUp method, we mute after 3 tries assertTrue(!stateManager.isMuted(nodeId, adId)); - Settings newSettings = Settings.builder().put(AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.getKey(), "1").build(); + Settings newSettings = Settings.builder().put(TimeSeriesSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE.getKey(), "1").build(); Settings.Builder target = Settings.builder(); clusterSettings.updateDynamicSettings(newSettings, target, Settings.builder(), "test"); clusterSettings.applySettings(target.build()); @@ -380,7 +365,7 @@ public void testSettingUpdateBackOffMin() { assertTrue(stateManager.isMuted(nodeId, adId)); - Settings newSettings = Settings.builder().put(AnomalyDetectorSettings.BACKOFF_MINUTES.getKey(), "1m").build(); + Settings newSettings = Settings.builder().put(TimeSeriesSettings.BACKOFF_MINUTES.getKey(), "1m").build(); Settings.Builder target = Settings.builder(); clusterSettings.updateDynamicSettings(newSettings, target, Settings.builder(), "test"); clusterSettings.applySettings(target.build()); @@ -399,8 +384,8 @@ private String setupJob() throws IOException { doAnswer(invocation -> { GetRequest request = invocation.getArgument(0); ActionListener listener = invocation.getArgument(1); - if (request.index().equals(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)) { - listener.onResponse(TestHelpers.createGetResponse(jobToCheck, detectorId, AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)); + if (request.index().equals(CommonName.JOB_INDEX)) { + listener.onResponse(TestHelpers.createGetResponse(jobToCheck, detectorId, CommonName.JOB_INDEX)); } return null; }).when(client).get(any(), any(ActionListener.class)); @@ -411,7 +396,7 @@ private String setupJob() throws IOException { public void testGetAnomalyJob() throws IOException, InterruptedException { String detectorId = 
setupJob(); final CountDownLatch inProgressLatch = new CountDownLatch(1); - stateManager.getAnomalyDetectorJob(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getJob(detectorId, ActionListener.wrap(asDetector -> { assertEquals(jobToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -431,7 +416,7 @@ public void testRepeatedGetAnomalyJob() throws IOException, InterruptedException String detectorId = setupJob(); final CountDownLatch inProgressLatch = new CountDownLatch(2); - stateManager.getAnomalyDetectorJob(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getJob(detectorId, ActionListener.wrap(asDetector -> { assertEquals(jobToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -439,7 +424,7 @@ public void testRepeatedGetAnomalyJob() throws IOException, InterruptedException inProgressLatch.countDown(); })); - stateManager.getAnomalyDetectorJob(detectorId, ActionListener.wrap(asDetector -> { + stateManager.getJob(detectorId, ActionListener.wrap(asDetector -> { assertEquals(jobToCheck, asDetector.get()); inProgressLatch.countDown(); }, exception -> { @@ -451,4 +436,118 @@ public void testRepeatedGetAnomalyJob() throws IOException, InterruptedException verify(client, times(1)).get(any(), any(ActionListener.class)); } + + public void testGetConfigAD() throws IOException, InterruptedException { + String configId = "123"; + AnomalyDetector detector = TestHelpers.randomAnomalyDetector(ImmutableMap.of("testKey", "testValue"), Instant.now()); + GetResponse getResponse = TestHelpers.createGetResponse(detector, configId, CommonName.CONFIG_INDEX); + doAnswer(invocationOnMock -> { + ((ActionListener) invocationOnMock.getArguments()[1]).onResponse(getResponse); + return null; + }).when(client).get(any(GetRequest.class), any()); + + final AtomicReference actualResponse = new AtomicReference<>(); + final AtomicReference exception = new AtomicReference<>(); + ActionListener listener = new ActionListener<>() { + @Override + public void onResponse(AnomalyDetector resultResponse) { + actualResponse.set(resultResponse); + } + + @Override + public void onFailure(Exception e) { + exception.set(e); + } + }; + + CountDownLatch latch = new CountDownLatch(1); + ActionListener latchListener = new LatchedActionListener<>(listener, latch); + + Consumer> function = mock(Consumer.class); + doAnswer(invocationOnMock -> { + Optional receivedDetector = (Optional) invocationOnMock.getArguments()[0]; + latchListener.onResponse(receivedDetector.get()); + return null; + }).when(function).accept(any(Optional.class)); + + stateManager.getConfig(configId, AnalysisType.AD, function, latchListener); + assertTrue(latch.await(30L, TimeUnit.SECONDS)); + assertNotNull(actualResponse.get()); + assertNull(exception.get()); + org.hamcrest.MatcherAssert.assertThat(actualResponse.get(), equalTo(detector)); + } + + public void testGetConfigForecaster() throws IOException, InterruptedException { + String configId = "123"; + Forecaster forecaster = TestHelpers.randomForecaster(); + GetResponse getResponse = TestHelpers.createGetResponse(forecaster, configId, CommonName.CONFIG_INDEX); + doAnswer(invocationOnMock -> { + ((ActionListener) invocationOnMock.getArguments()[1]).onResponse(getResponse); + return null; + }).when(client).get(any(GetRequest.class), any()); + + final AtomicReference actualResponse = new AtomicReference<>(); + final AtomicReference exception = new AtomicReference<>(); + ActionListener listener = new ActionListener<>() { + @Override + public void 
onResponse(Forecaster resultResponse) { + actualResponse.set(resultResponse); + } + + @Override + public void onFailure(Exception e) { + exception.set(e); + } + }; + + CountDownLatch latch = new CountDownLatch(1); + ActionListener latchListener = new LatchedActionListener<>(listener, latch); + + Consumer> function = mock(Consumer.class); + doAnswer(invocationOnMock -> { + Optional receivedDetector = (Optional) invocationOnMock.getArguments()[0]; + latchListener.onResponse(receivedDetector.get()); + return null; + }).when(function).accept(any(Optional.class)); + + stateManager.getConfig(configId, AnalysisType.FORECAST, function, latchListener); + assertTrue(latch.await(30L, TimeUnit.SECONDS)); + assertNotNull(actualResponse.get()); + assertNull(exception.get()); + org.hamcrest.MatcherAssert.assertThat(actualResponse.get(), equalTo(forecaster)); + } + + public void testGetConfigException() throws IOException, InterruptedException { + String configId = "123"; + Exception testException = new Exception("Test exception"); + doAnswer(invocationOnMock -> { + ((ActionListener) invocationOnMock.getArguments()[1]).onFailure(testException); + return null; + }).when(client).get(any(GetRequest.class), any()); + + final AtomicReference actualResponse = new AtomicReference<>(); + final AtomicReference exception = new AtomicReference<>(); + ActionListener listener = new ActionListener<>() { + @Override + public void onResponse(Forecaster resultResponse) { + actualResponse.set(resultResponse); + } + + @Override + public void onFailure(Exception e) { + exception.set(e); + } + }; + + CountDownLatch latch = new CountDownLatch(1); + ActionListener latchListener = new LatchedActionListener<>(listener, latch); + + Consumer> function = mock(Consumer.class); + + stateManager.getConfig(configId, AnalysisType.FORECAST, function, latchListener); + assertTrue(latch.await(30L, TimeUnit.SECONDS)); + assertNull(actualResponse.get()); + assertNotNull(exception.get()); + assertEquals("Test exception", exception.get().getMessage()); + } } diff --git a/src/test/java/org/opensearch/ad/NodeStateTests.java b/src/test/java/org/opensearch/timeseries/NodeStateTests.java similarity index 82% rename from src/test/java/org/opensearch/ad/NodeStateTests.java rename to src/test/java/org/opensearch/timeseries/NodeStateTests.java index c2958dcbf..f97b288d7 100644 --- a/src/test/java/org/opensearch/ad/NodeStateTests.java +++ b/src/test/java/org/opensearch/timeseries/NodeStateTests.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad; +package org.opensearch.timeseries; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; @@ -19,8 +19,8 @@ import java.time.Duration; import java.time.Instant; -import org.opensearch.ad.common.exception.AnomalyDetectionException; import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.common.exception.TimeSeriesException; public class NodeStateTests extends OpenSearchTestCase { private NodeState state; @@ -37,7 +37,7 @@ public void setUp() throws Exception { public void testMaintenanceNotRemoveSingle() throws IOException { when(clock.instant()).thenReturn(Instant.ofEpochMilli(1000)); - state.setDetectorDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); + state.setConfigDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); when(clock.instant()).thenReturn(Instant.MIN); assertTrue(!state.expired(duration)); @@ -45,8 +45,8 @@ public void testMaintenanceNotRemoveSingle() throws IOException { public void testMaintenanceNotRemove() throws IOException { when(clock.instant()).thenReturn(Instant.ofEpochSecond(1000)); - state.setDetectorDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); - state.setLastDetectionError(null); + state.setConfigDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); + state.setException(null); when(clock.instant()).thenReturn(Instant.ofEpochSecond(3700)); assertTrue(!state.expired(duration)); @@ -55,11 +55,11 @@ public void testMaintenanceNotRemove() throws IOException { public void testMaintenanceRemoveLastError() throws IOException { when(clock.instant()).thenReturn(Instant.ofEpochMilli(1000)); state - .setDetectorDef( + .setConfigDef( TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null) ); - state.setLastDetectionError(null); + state.setException(null); when(clock.instant()).thenReturn(Instant.ofEpochSecond(3700)); assertTrue(state.expired(duration)); @@ -67,7 +67,7 @@ public void testMaintenanceRemoveLastError() throws IOException { public void testMaintenancRemoveDetector() throws IOException { when(clock.instant()).thenReturn(Instant.MIN); - state.setDetectorDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); + state.setConfigDef(TestHelpers.randomAnomalyDetector(TestHelpers.randomUiMetadata(), null)); when(clock.instant()).thenReturn(Instant.MAX); assertTrue(state.expired(duration)); @@ -89,14 +89,14 @@ public void testMaintenancFlagRemove() throws IOException { public void testMaintenanceLastColdStartRemoved() { when(clock.instant()).thenReturn(Instant.ofEpochMilli(1000)); - state.setException(new AnomalyDetectionException("123", "")); + state.setException(new TimeSeriesException("123", "")); when(clock.instant()).thenReturn(Instant.ofEpochSecond(3700)); assertTrue(state.expired(duration)); } public void testMaintenanceLastColdStartNotRemoved() { when(clock.instant()).thenReturn(Instant.ofEpochMilli(1_000_000L)); - state.setException(new AnomalyDetectionException("123", "")); + state.setException(new TimeSeriesException("123", "")); when(clock.instant()).thenReturn(Instant.ofEpochSecond(3700)); assertTrue(!state.expired(duration)); } diff --git a/src/test/java/org/opensearch/ad/TestHelpers.java b/src/test/java/org/opensearch/timeseries/TestHelpers.java similarity index 78% rename from src/test/java/org/opensearch/ad/TestHelpers.java rename to src/test/java/org/opensearch/timeseries/TestHelpers.java index ea0109e68..65b6898e0 100644 --- 
a/src/test/java/org/opensearch/ad/TestHelpers.java +++ b/src/test/java/org/opensearch/timeseries/TestHelpers.java @@ -9,17 +9,23 @@ * GitHub history for details. */ -package org.opensearch.ad; +package org.opensearch.timeseries; import static org.apache.http.entity.ContentType.APPLICATION_JSON; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; -import static org.opensearch.ad.model.AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX; import static org.opensearch.cluster.node.DiscoveryNodeRole.BUILT_IN_ROLES; import static org.opensearch.core.xcontent.XContentParserUtils.ensureExpectedToken; import static org.opensearch.index.query.AbstractQueryBuilder.parseInnerQueryBuilder; import static org.opensearch.index.seqno.SequenceNumbers.UNASSIGNED_SEQ_NO; -import static org.opensearch.test.OpenSearchTestCase.*; +import static org.opensearch.test.OpenSearchTestCase.buildNewFakeTransportAddress; +import static org.opensearch.test.OpenSearchTestCase.randomAlphaOfLength; +import static org.opensearch.test.OpenSearchTestCase.randomBoolean; +import static org.opensearch.test.OpenSearchTestCase.randomDouble; +import static org.opensearch.test.OpenSearchTestCase.randomDoubleBetween; +import static org.opensearch.test.OpenSearchTestCase.randomInt; +import static org.opensearch.test.OpenSearchTestCase.randomIntBetween; +import static org.opensearch.test.OpenSearchTestCase.randomLong; import java.io.IOException; import java.nio.ByteBuffer; @@ -33,10 +39,12 @@ import java.util.List; import java.util.Locale; import java.util.Map; +import java.util.Optional; import java.util.Random; import java.util.Set; import java.util.concurrent.Callable; import java.util.function.Consumer; +import java.util.stream.DoubleStream; import java.util.stream.IntStream; import org.apache.http.Header; @@ -57,36 +65,23 @@ import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.ShardSearchFailure; -import org.opensearch.ad.constant.CommonErrorMessages; -import org.opensearch.ad.constant.CommonName; +import org.opensearch.ad.constant.ADCommonName; import org.opensearch.ad.constant.CommonValue; import org.opensearch.ad.feature.Features; -import org.opensearch.ad.indices.AnomalyDetectionIndices; +import org.opensearch.ad.indices.ADIndexManagement; import org.opensearch.ad.ml.ThresholdingResult; import org.opensearch.ad.mock.model.MockSimpleLog; import org.opensearch.ad.model.ADTask; -import org.opensearch.ad.model.ADTaskState; import org.opensearch.ad.model.ADTaskType; import org.opensearch.ad.model.AnomalyDetector; import org.opensearch.ad.model.AnomalyDetectorExecutionInput; -import org.opensearch.ad.model.AnomalyDetectorJob; import org.opensearch.ad.model.AnomalyResult; import org.opensearch.ad.model.AnomalyResultBucket; -import org.opensearch.ad.model.DataByFeatureId; -import org.opensearch.ad.model.DetectionDateRange; import org.opensearch.ad.model.DetectorInternalState; import org.opensearch.ad.model.DetectorValidationIssue; -import org.opensearch.ad.model.DetectorValidationIssueType; -import org.opensearch.ad.model.Entity; import org.opensearch.ad.model.ExpectedValueList; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.model.FeatureData; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.model.TimeConfiguration; -import org.opensearch.ad.model.ValidationAspect; import org.opensearch.ad.ratelimit.RequestPriority; import org.opensearch.ad.ratelimit.ResultWriteRequest; 
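/*
 * The import churn in this hunk mirrors the central refactor of the patch:
 * classes shared by anomaly detection and forecasting move from
 * org.opensearch.ad.* into the common org.opensearch.timeseries.* packages.
 * A representative before/after pair taken from this patch's own hunks:
 *
 *   // before
 *   import org.opensearch.ad.model.IntervalTimeConfiguration;
 *   // after
 *   import org.opensearch.timeseries.model.IntervalTimeConfiguration;
 */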
-import org.opensearch.ad.settings.AnomalyDetectorSettings; import org.opensearch.client.AdminClient; import org.opensearch.client.Client; import org.opensearch.client.Request; @@ -104,6 +99,7 @@ import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.CheckedConsumer; import org.opensearch.common.Priority; +import org.opensearch.common.Randomness; import org.opensearch.common.UUIDs; import org.opensearch.common.settings.ClusterSettings; import org.opensearch.common.settings.Settings; @@ -122,6 +118,8 @@ import org.opensearch.core.xcontent.ToXContentObject; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.forecast.model.ForecastTask; +import org.opensearch.forecast.model.Forecaster; import org.opensearch.index.get.GetResult; import org.opensearch.index.query.BoolQueryBuilder; import org.opensearch.index.query.MatchAllQueryBuilder; @@ -138,8 +136,26 @@ import org.opensearch.search.profile.SearchProfileShardResults; import org.opensearch.search.suggest.Suggest; import org.opensearch.test.ClusterServiceUtils; +import org.opensearch.test.OpenSearchTestCase; import org.opensearch.test.rest.OpenSearchRestTestCase; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.constant.CommonMessages; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.ImputationMethod; +import org.opensearch.timeseries.dataprocessor.ImputationOption; +import org.opensearch.timeseries.model.Config; +import org.opensearch.timeseries.model.DataByFeatureId; +import org.opensearch.timeseries.model.DateRange; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.FeatureData; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.model.Job; +import org.opensearch.timeseries.model.TaskState; +import org.opensearch.timeseries.model.TimeConfiguration; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; +import org.opensearch.timeseries.settings.TimeSeriesSettings; import com.google.common.collect.ImmutableList; import com.google.common.collect.ImmutableMap; @@ -153,12 +169,12 @@ public class TestHelpers { public static final String AD_BASE_PREVIEW_URI = AD_BASE_DETECTORS_URI + "/%s/_preview"; public static final String AD_BASE_STATS_URI = "/_plugins/_anomaly_detection/stats"; public static ImmutableSet HISTORICAL_ANALYSIS_RUNNING_STATS = ImmutableSet - .of(ADTaskState.CREATED.name(), ADTaskState.INIT.name(), ADTaskState.RUNNING.name()); + .of(TaskState.CREATED.name(), TaskState.INIT.name(), TaskState.RUNNING.name()); // Task may fail if memory circuit breaker triggered. 
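/*
 * ADTaskState is likewise generalized to the shared
 * org.opensearch.timeseries.model.TaskState enum, so AD and forecast tasks
 * report the same lifecycle states (CREATED, INIT, RUNNING, FINISHED,
 * FAILED, STOPPED). The status sets in this hunk partition those states;
 * for example, mirroring the replacement lines here:
 *
 *   ImmutableSet<String> doneStates = ImmutableSet
 *       .of(TaskState.FAILED.name(), TaskState.FINISHED.name(), TaskState.STOPPED.name());
 */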
public static final Set HISTORICAL_ANALYSIS_FINISHED_FAILED_STATS = ImmutableSet - .of(ADTaskState.FINISHED.name(), ADTaskState.FAILED.name()); + .of(TaskState.FINISHED.name(), TaskState.FAILED.name()); public static ImmutableSet HISTORICAL_ANALYSIS_DONE_STATS = ImmutableSet - .of(ADTaskState.FAILED.name(), ADTaskState.FINISHED.name(), ADTaskState.STOPPED.name()); + .of(TaskState.FAILED.name(), TaskState.FINISHED.name(), TaskState.STOPPED.name()); private static final Logger logger = LogManager.getLogger(TestHelpers.class); public static final Random random = new Random(42); @@ -311,7 +327,8 @@ public static AnomalyDetector randomAnomalyDetector( lastUpdateTime, categoryFields, user, - null + null, + TestHelpers.randomImputationOption() ); } @@ -355,12 +372,13 @@ public static AnomalyDetector randomDetector( Instant.now(), categoryFields, null, - resultIndex + resultIndex, + TestHelpers.randomImputationOption() ); } - public static DetectionDateRange randomDetectionDateRange() { - return new DetectionDateRange( + public static DateRange randomDetectionDateRange() { + return new DateRange( Instant.now().truncatedTo(ChronoUnit.SECONDS).minus(10, ChronoUnit.DAYS), Instant.now().truncatedTo(ChronoUnit.SECONDS) ); @@ -403,13 +421,14 @@ public static AnomalyDetector randomAnomalyDetectorUsingCategoryFields( randomQuery(), randomIntervalTimeConfiguration(), new IntervalTimeConfiguration(0, ChronoUnit.MINUTES), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), null, randomInt(), Instant.now(), categoryFields, randomUser(), - resultIndex + resultIndex, + TestHelpers.randomImputationOption() ); } @@ -433,13 +452,14 @@ public static AnomalyDetector randomAnomalyDetector(String timefield, String ind randomQuery(), randomIntervalTimeConfiguration(), randomIntervalTimeConfiguration(), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), null, randomInt(), Instant.now(), null, randomUser(), - null + null, + TestHelpers.randomImputationOption() ); } @@ -455,13 +475,14 @@ public static AnomalyDetector randomAnomalyDetectorWithEmptyFeature() throws IOE randomQuery(), randomIntervalTimeConfiguration(), randomIntervalTimeConfiguration(), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), null, randomInt(), Instant.now().truncatedTo(ChronoUnit.SECONDS), null, randomUser(), - null + null, + TestHelpers.randomImputationOption() ); } @@ -482,13 +503,14 @@ public static AnomalyDetector randomAnomalyDetectorWithInterval(TimeConfiguratio randomQuery(), interval, randomIntervalTimeConfiguration(), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), null, randomInt(), Instant.now().truncatedTo(ChronoUnit.SECONDS), categoryField, randomUser(), - null + null, + TestHelpers.randomImputationOption() ); } @@ -509,13 +531,14 @@ public static class AnomalyDetectorBuilder { private QueryBuilder filterQuery; private TimeConfiguration detectionInterval = randomIntervalTimeConfiguration(); private TimeConfiguration windowDelay = randomIntervalTimeConfiguration(); - private Integer shingleSize = randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE); + private Integer shingleSize = randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE); private Map uiMetadata = null; private Integer schemaVersion = randomInt(); private Instant 
lastUpdateTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); private List categoryFields = null; private User user = randomUser(); private String resultIndex = null; + private ImputationOption imputationOption = null; public static AnomalyDetectorBuilder newInstance() throws IOException { return new AnomalyDetectorBuilder(); @@ -610,6 +633,11 @@ public AnomalyDetectorBuilder setResultIndex(String resultIndex) { return this; } + public AnomalyDetectorBuilder setImputationOption(ImputationMethod method, Optional defaultFill, boolean integerSensitive) { + this.imputationOption = new ImputationOption(method, defaultFill, integerSensitive); + return this; + } + public AnomalyDetector build() { return new AnomalyDetector( detectorId, @@ -628,7 +656,8 @@ public AnomalyDetector build() { lastUpdateTime, categoryFields, user, - resultIndex + resultIndex, + imputationOption ); } } @@ -647,13 +676,14 @@ public static AnomalyDetector randomAnomalyDetectorWithInterval(TimeConfiguratio randomQuery(), interval, randomIntervalTimeConfiguration(), - randomIntBetween(1, AnomalyDetectorSettings.MAX_SHINGLE_SIZE), + randomIntBetween(1, TimeSeriesSettings.MAX_SHINGLE_SIZE), null, randomInt(), Instant.now().truncatedTo(ChronoUnit.SECONDS), categoryField, randomUser(), - null + null, + TestHelpers.randomImputationOption() ); } @@ -849,7 +879,7 @@ public static AnomalyResult randomAnomalyDetectResult(double score, String error Instant.now().truncatedTo(ChronoUnit.SECONDS), Instant.now().truncatedTo(ChronoUnit.SECONDS), error, - null, + Optional.empty(), user, CommonValue.NO_SCHEMA_VERSION, null, @@ -929,8 +959,8 @@ public static AnomalyResult randomHCADAnomalyDetectResult( endTimeEpochMillis == null ? Instant.now().truncatedTo(ChronoUnit.SECONDS) : Instant.ofEpochMilli(endTimeEpochMillis), error, entityAttrs == null - ? 
Optional.ofNullable(Entity.createSingleAttributeEntity(randomAlphaOfLength(5), randomAlphaOfLength(5))) + : Optional.ofNullable(Entity.createEntityByReordering(entityAttrs)), randomUser(), CommonValue.NO_SCHEMA_VERSION, null, @@ -942,12 +972,12 @@ public static AnomalyResult randomHCADAnomalyDetectResult( ); } - public static AnomalyDetectorJob randomAnomalyDetectorJob() { + public static Job randomAnomalyDetectorJob() { return randomAnomalyDetectorJob(true); } - public static AnomalyDetectorJob randomAnomalyDetectorJob(boolean enabled, Instant enabledTime, Instant disabledTime) { - return new AnomalyDetectorJob( + public static Job randomAnomalyDetectorJob(boolean enabled, Instant enabledTime, Instant disabledTime) { + return new Job( randomAlphaOfLength(10), randomIntervalSchedule(), randomIntervalTimeConfiguration(), @@ -961,7 +991,7 @@ public static AnomalyDetectorJob randomAnomalyDetectorJob(boolean enabled, Insta ); } - public static AnomalyDetectorJob randomAnomalyDetectorJob(boolean enabled) { + public static Job randomAnomalyDetectorJob(boolean enabled) { return randomAnomalyDetectorJob( enabled, Instant.now().truncatedTo(ChronoUnit.SECONDS), @@ -1106,8 +1136,8 @@ public static void createEmptyIndexMapping(RestClient client, String indexName, } public static void createEmptyAnomalyResultIndex(RestClient client) throws IOException { - createEmptyIndex(client, CommonName.ANOMALY_RESULT_INDEX_ALIAS); - createIndexMapping(client, CommonName.ANOMALY_RESULT_INDEX_ALIAS, toHttpEntity(AnomalyDetectionIndices.getAnomalyResultMappings())); + createEmptyIndex(client, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS); + createIndexMapping(client, ADCommonName.ANOMALY_RESULT_INDEX_ALIAS, toHttpEntity(ADIndexManagement.getResultMappings())); } public static void createEmptyIndex(RestClient client, String indexName) throws IOException { @@ -1240,7 +1270,7 @@ public static Map categoryField = detector.getCategoryField(); + List categoryField = detector.getCategoryFields(); if (categoryField != null) { if (categoryField.size() == 1) { entity = Entity.createSingleAttributeEntity(categoryField.get(0), randomAlphaOfLength(5)); @@ -1286,7 +1316,7 @@ public static ADTask randomAdTask( .builder() .taskId(taskId) .taskType(adTaskType.name()) - .detectorId(detectorId) + .configId(detectorId) .detector(detector) .state(state.name()) .taskProgress(0.5f) @@ -1305,14 +1335,14 @@ public static ADTask randomAdTask( return task; } - public static ADTask randomAdTask(String taskId, ADTaskState state, Instant executionEndTime, String stoppedBy, boolean withDetector) + public static ADTask randomAdTask(String taskId, TaskState state, Instant executionEndTime, String stoppedBy, boolean withDetector) throws IOException { return randomAdTask(taskId, state, executionEndTime, stoppedBy, withDetector, ADTaskType.HISTORICAL_SINGLE_ENTITY); } public static ADTask randomAdTask( String taskId, - ADTaskState state, + TaskState state, Instant executionEndTime, String stoppedBy, boolean withDetector, @@ -1343,7 +1373,7 @@ public static ADTask randomAdTask( .builder() .taskId(taskId) .taskType(adTaskType.name()) - .detectorId(randomAlphaOfLength(5)) + .configId(randomAlphaOfLength(5)) .detector(detector) .entity(entity) .state(state.name()) @@ -1365,28 +1395,19 @@ public static ADTask randomAdTask( public static ADTask randomAdTask( String taskId, - ADTaskState state, + TaskState state, Instant executionEndTime, String stoppedBy, AnomalyDetector detector ) { executionEndTime = executionEndTime == null ? 
null : executionEndTime.truncatedTo(ChronoUnit.SECONDS); - Entity entity = null; - if (detector != null) { - if (detector.isMultiCategoryDetector()) { - Map attrMap = new HashMap<>(); - detector.getCategoryField().stream().forEach(f -> attrMap.put(f, randomAlphaOfLength(5))); - entity = Entity.createEntityByReordering(attrMap); - } else if (detector.isMultientityDetector()) { - entity = Entity.createEntityByReordering(ImmutableMap.of(detector.getCategoryField().get(0), randomAlphaOfLength(5))); - } - } + Entity entity = randomEntity(detector); String taskType = entity == null ? ADTaskType.HISTORICAL_SINGLE_ENTITY.name() : ADTaskType.HISTORICAL_HC_ENTITY.name(); ADTask task = ADTask .builder() .taskId(taskId) .taskType(taskType) - .detectorId(randomAlphaOfLength(5)) + .configId(randomAlphaOfLength(5)) .detector(detector) .state(state.name()) .taskProgress(0.5f) @@ -1406,6 +1427,33 @@ public static ADTask randomAdTask( return task; } + /** + * Generates a random Entity based on the provided configuration. + * + * If the configuration has multiple categories, a new Entity is created with attributes + * populated with random alphanumeric strings of length 5. + * + * If the configuration is marked as high cardinality and does not have multiple categories, + * a new Entity is created with a single attribute using the first category field and a random + * alphanumeric string of length 5. + * + * @param config The configuration object containing information about a time series analysis. + * @return A randomly generated Entity based on the configuration, or null if the config is null. + */ + public static Entity randomEntity(Config config) { + Entity entity = null; + if (config != null) { + if (config.hasMultipleCategories()) { + Map attrMap = new HashMap<>(); + config.getCategoryFields().stream().forEach(f -> attrMap.put(f, randomAlphaOfLength(5))); + entity = Entity.createEntityByReordering(attrMap); + } else if (config.isHighCardinality()) { + entity = Entity.createEntityByReordering(ImmutableMap.of(config.getCategoryFields().get(0), randomAlphaOfLength(5))); + } + } + return entity; + } + public static HttpEntity toHttpEntity(ToXContentObject object) throws IOException { return new StringEntity(toJsonString(object), APPLICATION_JSON); } @@ -1479,7 +1527,7 @@ public static Map parseStatsResult(String statsResult) throws IO public static DetectorValidationIssue randomDetectorValidationIssue() { DetectorValidationIssue issue = new DetectorValidationIssue( ValidationAspect.DETECTOR, - DetectorValidationIssueType.NAME, + ValidationIssueType.NAME, randomAlphaOfLength(5) ); return issue; @@ -1488,7 +1536,7 @@ public static DetectorValidationIssue randomDetectorValidationIssue() { public static DetectorValidationIssue randomDetectorValidationIssueWithSubIssues(Map subIssues) { DetectorValidationIssue issue = new DetectorValidationIssue( ValidationAspect.DETECTOR, - DetectorValidationIssueType.NAME, + ValidationIssueType.NAME, randomAlphaOfLength(5), subIssues, null @@ -1499,8 +1547,8 @@ public static DetectorValidationIssue randomDetectorValidationIssueWithSubIssues public static DetectorValidationIssue randomDetectorValidationIssueWithDetectorIntervalRec(long intervalRec) { DetectorValidationIssue issue = new DetectorValidationIssue( ValidationAspect.MODEL, - DetectorValidationIssueType.DETECTION_INTERVAL, - CommonErrorMessages.DETECTOR_INTERVAL_REC + intervalRec, + ValidationIssueType.DETECTION_INTERVAL, + CommonMessages.INTERVAL_REC + intervalRec, null, new IntervalTimeConfiguration(intervalRec, 
ChronoUnit.MINUTES) ); @@ -1509,9 +1557,10 @@ public static ClusterState createClusterState() { final Map mappings = new HashMap<>(); + mappings .put( - ANOMALY_DETECTOR_JOB_INDEX, + CommonName.JOB_INDEX, IndexMetadata .builder("test") .settings( @@ -1523,7 +1572,11 @@ public static ClusterState createClusterState() { ) .build() ); - Metadata metaData = Metadata.builder().indices(mappings).build(); + + // The usage of Collections.unmodifiableMap is due to replacing ImmutableOpenMap + // with java.util.Map in the core (refer to https://tinyurl.com/5fjdccs3) + // The meaning and logic of the code stay the same. + Metadata metaData = Metadata.builder().indices(Collections.unmodifiableMap(mappings)).build(); ClusterState clusterState = new ClusterState( new ClusterName("test_name"), 1l, @@ -1538,4 +1591,288 @@ public static ClusterState createClusterState() { ); return clusterState; } + + public static ImputationOption randomImputationOption() { + double[] defaultFill = DoubleStream.generate(OpenSearchTestCase::randomDouble).limit(10).toArray(); + ImputationOption fixedValue = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill), false); + ImputationOption linear = new ImputationOption(ImputationMethod.LINEAR, Optional.of(defaultFill), false); + ImputationOption linearIntSensitive = new ImputationOption(ImputationMethod.LINEAR, Optional.of(defaultFill), true); + ImputationOption zero = new ImputationOption(ImputationMethod.ZERO); + ImputationOption previous = new ImputationOption(ImputationMethod.PREVIOUS); + + List options = List.of(fixedValue, linear, linearIntSensitive, zero, previous); + + // Select a random option + int randomIndex = Randomness.get().nextInt(options.size()); + return options.get(randomIndex); + } + + public static class ForecasterBuilder { String forecasterId; Long version; String name; String description; String timeField; List indices; List features; QueryBuilder filterQuery; TimeConfiguration forecastInterval; TimeConfiguration windowDelay; Integer shingleSize; Map uiMetadata; Integer schemaVersion; Instant lastUpdateTime; List categoryFields; User user; String resultIndex; Integer horizon; ImputationOption imputationOption; + + ForecasterBuilder() throws IOException { + forecasterId = randomAlphaOfLength(10); + version = randomLong(); + name = randomAlphaOfLength(10); + description = randomAlphaOfLength(20); + timeField = randomAlphaOfLength(5); + indices = ImmutableList.of(randomAlphaOfLength(10)); + features = ImmutableList.of(randomFeature()); + filterQuery = randomQuery(); + forecastInterval = randomIntervalTimeConfiguration(); + windowDelay = randomIntervalTimeConfiguration(); + shingleSize = randomIntBetween(1, 20); + uiMetadata = ImmutableMap.of(randomAlphaOfLength(5), randomAlphaOfLength(10)); + schemaVersion = randomInt(); + lastUpdateTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); + categoryFields = ImmutableList.of(randomAlphaOfLength(5)); + user = randomUser(); + resultIndex = null; + horizon = randomIntBetween(1, 20); + imputationOption = randomImputationOption(); + } + + public static ForecasterBuilder newInstance() throws IOException { + return new ForecasterBuilder(); + } + + public ForecasterBuilder setConfigId(String configId) { + this.forecasterId = configId; + return this; + } + + public ForecasterBuilder setVersion(Long version) { + this.version = version; + return 
this; + } + + public ForecasterBuilder setName(String name) { + this.name = name; + return this; + } + + public ForecasterBuilder setDescription(String description) { + this.description = description; + return this; + } + + public ForecasterBuilder setTimeField(String timeField) { + this.timeField = timeField; + return this; + } + + public ForecasterBuilder setIndices(List indices) { + this.indices = indices; + return this; + } + + public ForecasterBuilder setFeatureAttributes(List featureAttributes) { + this.features = featureAttributes; + return this; + } + + public ForecasterBuilder setFilterQuery(QueryBuilder filterQuery) { + this.filterQuery = filterQuery; + return this; + } + + public ForecasterBuilder setDetectionInterval(TimeConfiguration forecastInterval) { + this.forecastInterval = forecastInterval; + return this; + } + + public ForecasterBuilder setWindowDelay(TimeConfiguration windowDelay) { + this.windowDelay = windowDelay; + return this; + } + + public ForecasterBuilder setShingleSize(Integer shingleSize) { + this.shingleSize = shingleSize; + return this; + } + + public ForecasterBuilder setUiMetadata(Map uiMetadata) { + this.uiMetadata = uiMetadata; + return this; + } + + public ForecasterBuilder setSchemaVersion(Integer schemaVersion) { + this.schemaVersion = schemaVersion; + return this; + } + + public ForecasterBuilder setLastUpdateTime(Instant lastUpdateTime) { + this.lastUpdateTime = lastUpdateTime; + return this; + } + + public ForecasterBuilder setCategoryFields(List categoryFields) { + this.categoryFields = categoryFields; + return this; + } + + public ForecasterBuilder setUser(User user) { + this.user = user; + return this; + } + + public ForecasterBuilder setCustomResultIndex(String resultIndex) { + this.resultIndex = resultIndex; + return this; + } + + public ForecasterBuilder setNullImputationOption() { + this.imputationOption = null; + return this; + } + + public Forecaster build() { + return new Forecaster( + forecasterId, + version, + name, + description, + timeField, + indices, + features, + filterQuery, + forecastInterval, + windowDelay, + shingleSize, + uiMetadata, + schemaVersion, + lastUpdateTime, + categoryFields, + user, + resultIndex, + horizon, + imputationOption + ); + } + } + + public static Forecaster randomForecaster() throws IOException { + return new Forecaster( + randomAlphaOfLength(10), + randomLong(), + randomAlphaOfLength(10), + randomAlphaOfLength(20), + randomAlphaOfLength(5), + ImmutableList.of(randomAlphaOfLength(10)), + ImmutableList.of(randomFeature()), + randomQuery(), + randomIntervalTimeConfiguration(), + randomIntervalTimeConfiguration(), + randomIntBetween(1, 20), + ImmutableMap.of(randomAlphaOfLength(5), randomAlphaOfLength(10)), + randomInt(), + Instant.now().truncatedTo(ChronoUnit.SECONDS), + ImmutableList.of(randomAlphaOfLength(5)), + randomUser(), + null, + randomIntBetween(1, 20), + randomImputationOption() + ); + } + + public static class ForecastTaskBuilder { + private String configId = "config123"; + private String taskId = "task123"; + private String taskType = "FORECAST_HISTORICAL_HC_ENTITY"; + private String state = "Running"; + private Float taskProgress = 0.5f; + private Float initProgress = 0.1f; + private Instant currentPiece = Instant.now().truncatedTo(ChronoUnit.SECONDS); + private Instant executionStartTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); + private Instant executionEndTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); + private Boolean isLatest = true; + private String error = "No errors"; + 
private String checkpointId = "checkpoint1"; + private Instant lastUpdateTime = Instant.now().truncatedTo(ChronoUnit.SECONDS); + private String startedBy = "user1"; + private String stoppedBy = "user2"; + private String coordinatingNode = "node1"; + private String workerNode = "node2"; + private Forecaster forecaster = TestHelpers.randomForecaster(); + private Entity entity = TestHelpers.randomEntity(forecaster); + private String parentTaskId = "parentTask1"; + private Integer estimatedMinutesLeft = 10; + protected User user = TestHelpers.randomUser(); + + private DateRange dateRange = new DateRange(Instant.ofEpochMilli(123), Instant.ofEpochMilli(456)); + + public ForecastTaskBuilder() throws IOException { + forecaster = TestHelpers.randomForecaster(); + } + + public static ForecastTaskBuilder newInstance() throws IOException { + return new ForecastTaskBuilder(); + } + + public ForecastTaskBuilder setForecaster(Forecaster associatedForecaster) { + this.forecaster = associatedForecaster; + return this; + } + + public ForecastTaskBuilder setUser(User associatedUser) { + this.user = associatedUser; + return this; + } + + public ForecastTaskBuilder setDateRange(DateRange associatedRange) { + this.dateRange = associatedRange; + return this; + } + + public ForecastTaskBuilder setEntity(Entity associatedEntity) { + this.entity = associatedEntity; + return this; + } + + public ForecastTask build() { + return new ForecastTask.Builder() + .configId(configId) + .taskId(taskId) + .lastUpdateTime(lastUpdateTime) + .startedBy(startedBy) + .stoppedBy(stoppedBy) + .error(error) + .state(state) + .taskProgress(taskProgress) + .initProgress(initProgress) + .currentPiece(currentPiece) + .executionStartTime(executionStartTime) + .executionEndTime(executionEndTime) + .isLatest(isLatest) + .taskType(taskType) + .checkpointId(checkpointId) + .coordinatingNode(coordinatingNode) + .workerNode(workerNode) + .entity(entity) + .parentTaskId(parentTaskId) + .estimatedMinutesLeft(estimatedMinutesLeft) + .user(user) + .forecaster(forecaster) + .dateRange(dateRange) + .build(); + } + } } diff --git a/src/test/java/org/opensearch/ad/AnomalyDetectorPluginTests.java b/src/test/java/org/opensearch/timeseries/TimeSeriesPluginTests.java similarity index 87% rename from src/test/java/org/opensearch/ad/AnomalyDetectorPluginTests.java rename to src/test/java/org/opensearch/timeseries/TimeSeriesPluginTests.java index e152e0b72..ff170a1d5 100644 --- a/src/test/java/org/opensearch/ad/AnomalyDetectorPluginTests.java +++ b/src/test/java/org/opensearch/timeseries/TimeSeriesPluginTests.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad; +package org.opensearch.timeseries; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; @@ -17,6 +17,7 @@ import java.util.List; import org.apache.commons.pool2.impl.GenericObjectPool; +import org.opensearch.ad.ADUnitTestCase; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.settings.ClusterSettings; @@ -26,13 +27,13 @@ import io.protostuff.LinkedBuffer; -public class AnomalyDetectorPluginTests extends ADUnitTestCase { - AnomalyDetectorPlugin plugin; +public class TimeSeriesPluginTests extends ADUnitTestCase { + TimeSeriesAnalyticsPlugin plugin; @Override public void setUp() throws Exception { super.setUp(); - plugin = new AnomalyDetectorPlugin(); + plugin = new TimeSeriesAnalyticsPlugin(); } @Override @@ -42,7 +43,7 @@ public void tearDown() throws Exception { } /** - * We have legacy setting. AnomalyDetectorPlugin's createComponents can trigger + * We have legacy setting. TimeSeriesAnalyticsPlugin's createComponents can trigger * warning when using these legacy settings. */ @Override @@ -79,7 +80,7 @@ public void testDeserializeRCFBufferPool() throws Exception { } public void testOverriddenJobTypeAndIndex() { - assertEquals("opendistro_anomaly_detector", plugin.getJobType()); + assertEquals("opensearch_time_series_analytics", plugin.getJobType()); assertEquals(".opendistro-anomaly-detector-jobs", plugin.getJobIndex()); } diff --git a/src/test/java/org/opensearch/timeseries/common/exception/ValidationExceptionTests.java b/src/test/java/org/opensearch/timeseries/common/exception/ValidationExceptionTests.java new file mode 100644 index 000000000..bfcd5ad7a --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/common/exception/ValidationExceptionTests.java @@ -0,0 +1,51 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + * + * Modifications Copyright OpenSearch Contributors. See + * GitHub history for details. 
+ */ + +package org.opensearch.timeseries.common.exception; + +import org.opensearch.forecast.constant.ForecastCommonName; +import org.opensearch.test.OpenSearchTestCase; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.model.ValidationAspect; +import org.opensearch.timeseries.model.ValidationIssueType; + +public class ValidationExceptionTests extends OpenSearchTestCase { + public void testConstructorDetector() { + String message = randomAlphaOfLength(5); + ValidationException exception = new ValidationException(message, ValidationIssueType.NAME, ValidationAspect.DETECTOR); + assertEquals(ValidationIssueType.NAME, exception.getType()); + assertEquals(ValidationAspect.DETECTOR, exception.getAspect()); + } + + public void testConstructorModel() { + String message = randomAlphaOfLength(5); + ValidationException exception = new ValidationException(message, ValidationIssueType.CATEGORY, ValidationAspect.MODEL); + assertEquals(ValidationIssueType.CATEGORY, exception.getType()); + assertEquals(ValidationAspect.getName(CommonName.MODEL_ASPECT), exception.getAspect()); + } + + public void testToString() { + String message = randomAlphaOfLength(5); + ValidationException exception = new ValidationException(message, ValidationIssueType.NAME, ValidationAspect.DETECTOR); + String exceptionString = exception.toString(); + logger.info("exception string: " + exceptionString); + ValidationException exceptionNoType = new ValidationException(message, ValidationIssueType.NAME, null); + String exceptionStringNoType = exceptionNoType.toString(); + logger.info("exception string no type: " + exceptionStringNoType); + } + + public void testForecasterAspect() { + String message = randomAlphaOfLength(5); + ValidationException exception = new ValidationException(message, ValidationIssueType.CATEGORY, ValidationAspect.FORECASTER); + assertEquals(ValidationIssueType.CATEGORY, exception.getType()); + assertEquals(ValidationAspect.getName(ForecastCommonName.FORECASTER_ASPECT), exception.getAspect()); + } +} diff --git a/src/test/java/org/opensearch/timeseries/dataprocessor/FixedValueImputerTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/FixedValueImputerTests.java new file mode 100644 index 000000000..81b9b5bfb --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/FixedValueImputerTests.java @@ -0,0 +1,34 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import static org.junit.Assert.assertArrayEquals; + +import org.junit.Test; + +public class FixedValueImputerTests { + + @Test + public void testImpute() { + // Initialize the FixedValueImputer with some fixed values + double[] fixedValues = { 2.0, 3.0 }; + FixedValueImputer imputer = new FixedValueImputer(fixedValues); + + // Create a sample array with some missing values (Double.NaN) + double[][] samples = { { 1.0, Double.NaN, 3.0 }, { Double.NaN, 2.0, 3.0 } }; + + // Call the impute method + double[][] imputed = imputer.impute(samples, 3); + + // Check the results + double[][] expected = { { 1.0, 2.0, 3.0 }, { 3.0, 2.0, 3.0 } }; + double delta = 0.0001; + + for (int i = 0; i < expected.length; i++) { + assertArrayEquals("The arrays are not equal", expected[i], imputed[i], delta); + } + } +} diff --git a/src/test/java/org/opensearch/timeseries/dataprocessor/ImputationOptionTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/ImputationOptionTests.java new file mode 100644 index 
000000000..9adb57ed9 --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/ImputationOptionTests.java @@ -0,0 +1,124 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import java.io.IOException; +import java.util.Optional; + +import org.opensearch.common.io.stream.BytesStreamOutput; +import org.opensearch.common.xcontent.json.JsonXContent; +import org.opensearch.core.common.bytes.BytesReference; +import org.opensearch.core.common.io.stream.StreamInput; +import org.opensearch.core.xcontent.DeprecationHandler; +import org.opensearch.core.xcontent.NamedXContentRegistry; +import org.opensearch.core.xcontent.ToXContent; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.test.OpenSearchTestCase; + +public class ImputationOptionTests extends OpenSearchTestCase { + + public void testStreamInputAndOutput() throws IOException { + // Prepare the data to be read by the StreamInput object. + ImputationMethod method = ImputationMethod.PREVIOUS; + double[] defaultFill = { 1.0, 2.0, 3.0 }; + + ImputationOption option = new ImputationOption(method, Optional.of(defaultFill), false); + + // Write the ImputationOption to the StreamOutput. + BytesStreamOutput out = new BytesStreamOutput(); + option.writeTo(out); + + StreamInput in = out.bytes().streamInput(); + + // Create an ImputationOption using the mocked StreamInput. + ImputationOption inOption = new ImputationOption(in); + + // Check that the created ImputationOption has the correct values. + assertEquals(method, inOption.getMethod()); + assertArrayEquals(defaultFill, inOption.getDefaultFill().get(), 1e-6); + } + + public void testToXContent() throws IOException { + double[] defaultFill = { 1.0, 2.0, 3.0 }; + ImputationOption imputationOption = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill), false); + + String xContent = "{" + "\"method\":\"FIXED_VALUES\"," + "\"defaultFill\":[1.0,2.0,3.0],\"integerSensitive\":false" + "}"; + + XContentBuilder builder = imputationOption.toXContent(JsonXContent.contentBuilder(), ToXContent.EMPTY_PARAMS); + String actualJson = BytesReference.bytes(builder).utf8ToString(); + + assertEquals(xContent, actualJson); + } + + public void testParse() throws IOException { + String xContent = "{" + "\"method\":\"FIXED_VALUES\"," + "\"defaultFill\":[1.0,2.0,3.0],\"integerSensitive\":false" + "}"; + + double[] defaultFill = { 1.0, 2.0, 3.0 }; + ImputationOption imputationOption = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill), false); + + try ( + XContentParser parser = JsonXContent.jsonXContent + .createParser(NamedXContentRegistry.EMPTY, DeprecationHandler.THROW_UNSUPPORTED_OPERATION, xContent) + ) { + // advance to first token + XContentParser.Token token = parser.nextToken(); + if (token != XContentParser.Token.START_OBJECT) { + throw new IOException("Expected data to start with an Object"); + } + + ImputationOption parsedOption = ImputationOption.parse(parser); + + assertEquals(imputationOption.getMethod(), parsedOption.getMethod()); + assertTrue(imputationOption.getDefaultFill().isPresent()); + assertTrue(parsedOption.getDefaultFill().isPresent()); + assertEquals(imputationOption.getDefaultFill().get().length, parsedOption.getDefaultFill().get().length); + for (int i = 0; i < imputationOption.getDefaultFill().get().length; i++) { + 
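+ // Editorial note, not part of the original test: the delta of 0 in the comparison below is
+ // deliberate. Parsing should reproduce the exact double literals serialized into the JSON,
+ // so no floating-point tolerance is needed when comparing the fill values element by element.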
assertEquals(imputationOption.getDefaultFill().get()[i], parsedOption.getDefaultFill().get()[i], 0); + } + } + } + + public void testEqualsAndHashCode() { + double[] defaultFill1 = { 1.0, 2.0, 3.0 }; + double[] defaultFill2 = { 4.0, 5.0, 6.0 }; + + ImputationOption option1 = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill1), false); + ImputationOption option2 = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill1), false); + ImputationOption option3 = new ImputationOption(ImputationMethod.LINEAR, Optional.of(defaultFill2), false); + + // Test reflexivity + assertTrue(option1.equals(option1)); + + // Test symmetry + assertTrue(option1.equals(option2)); + assertTrue(option2.equals(option1)); + + // Test transitivity + ImputationOption option2Clone = new ImputationOption(ImputationMethod.FIXED_VALUES, Optional.of(defaultFill1), false); + assertTrue(option1.equals(option2)); + assertTrue(option2.equals(option2Clone)); + assertTrue(option1.equals(option2Clone)); + + // Test consistency: multiple invocations of a.equals(b) consistently return true or consistently return false. + assertTrue(option1.equals(option2)); + assertTrue(option1.equals(option2)); + + // Test non-nullity + assertFalse(option1.equals(null)); + + // Test hashCode consistency + assertEquals(option1.hashCode(), option1.hashCode()); + + // Test hashCode equality + assertTrue(option1.equals(option2)); + assertEquals(option1.hashCode(), option2.hashCode()); + + // Test inequality + assertFalse(option1.equals(option3)); + assertNotEquals(option1.hashCode(), option3.hashCode()); + } +} diff --git a/src/test/java/org/opensearch/timeseries/dataprocessor/IntegerSensitiveLinearUniformImputerTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/IntegerSensitiveLinearUniformImputerTests.java new file mode 100644 index 000000000..03e8b6cb3 --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/IntegerSensitiveLinearUniformImputerTests.java @@ -0,0 +1,72 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import static org.junit.Assert.assertArrayEquals; + +import java.util.Arrays; +import java.util.Collection; + +import org.junit.Before; +import org.junit.Test; +import org.junit.runner.RunWith; +import org.junit.runners.Parameterized; +import org.junit.runners.Parameterized.Parameters; + +/** + * Compared to MultiFeatureLinearUniformImputerTests, outputs are different and + * integerSensitive is enabled + * + */ +@RunWith(Parameterized.class) +public class IntegerSensitiveLinearUniformImputerTests { + + @Parameters + public static Collection data() { + double[][] singleComponent = { { -1.0, 2.0 }, { 1.0, 1.0 } }; + double[][] multiComponent = { { 0.0, 1.0, -1.0 }, { 1.0, 1.0, 1.0 } }; + + return Arrays + .asList( + new Object[][] { + // after integer sensitive rint rounding + { singleComponent, 2, singleComponent }, + { singleComponent, 3, new double[][] { { -1.0, 0, 2.0 }, { 1.0, 1.0, 1.0 } } }, + { singleComponent, 4, new double[][] { { -1.0, 0.0, 1.0, 2.0 }, { 1.0, 1.0, 1.0, 1.0 } } }, + { multiComponent, 3, multiComponent }, + { multiComponent, 4, new double[][] { { 0.0, 1.0, 0.0, -1.0 }, { 1.0, 1.0, 1.0, 1.0 } } }, + { multiComponent, 5, new double[][] { { 0.0, 0.0, 1.0, 0.0, -1.0 }, { 1.0, 1.0, 1.0, 1.0, 1.0 } } }, + { multiComponent, 6, new double[][] { { 0.0, 0.0, 1.0, 1.0, -0.0, -1.0 }, { 1.0, 1.0, 1.0, 1.0, 1.0, 1.0 } } }, } + ); + } + + private double[][] 
input; + private int numInterpolants; + private double[][] expected; + private Imputer imputer; + + public IntegerSensitiveLinearUniformImputerTests(double[][] input, int numInterpolants, double[][] expected) { + this.input = input; + this.numInterpolants = numInterpolants; + this.expected = expected; + } + + @Before + public void setUp() { + this.imputer = new LinearUniformImputer(true); + } + + @Test + public void testImputation() { + double[][] actual = imputer.impute(input, numInterpolants); + double delta = 1e-8; + int numFeatures = expected.length; + + for (int i = 0; i < numFeatures; i++) { + assertArrayEquals(expected[i], actual[i], delta); + } + } +} diff --git a/src/test/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolatorTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/MultiFeatureLinearUniformImputerTests.java similarity index 81% rename from src/test/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolatorTests.java rename to src/test/java/org/opensearch/timeseries/dataprocessor/MultiFeatureLinearUniformImputerTests.java index 2d39566ca..3656be278 100644 --- a/src/test/java/org/opensearch/ad/dataprocessor/LinearUniformInterpolatorTests.java +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/MultiFeatureLinearUniformImputerTests.java @@ -9,7 +9,7 @@ * GitHub history for details. */ -package org.opensearch.ad.dataprocessor; +package org.opensearch.timeseries.dataprocessor; import static org.junit.Assert.assertArrayEquals; @@ -23,7 +23,7 @@ import org.junit.runners.Parameterized.Parameters; @RunWith(Parameterized.class) -public class LinearUniformInterpolatorTests { +public class MultiFeatureLinearUniformImputerTests { @Parameters public static Collection data() { @@ -34,6 +34,7 @@ public static Collection data() { return Arrays .asList( new Object[][] { + // no integer sensitive rint rounding at the end of singleFeatureImpute. 
{ singleComponent, 2, singleComponent }, { singleComponent, 3, new double[][] { { -1.0, 0.5, 2.0 }, { 1.0, 1.0, 1.0 } } }, { singleComponent, 4, new double[][] { { -1.0, 0.0, 1.0, 2.0 }, { 1.0, 1.0, 1.0, 1.0 } } }, @@ -47,9 +48,9 @@ public static Collection data() { private double[][] input; private int numInterpolants; private double[][] expected; - private LinearUniformInterpolator interpolator; + private Imputer imputer; - public LinearUniformInterpolatorTests(double[][] input, int numInterpolants, double[][] expected) { + public MultiFeatureLinearUniformImputerTests(double[][] input, int numInterpolants, double[][] expected) { this.input = input; this.numInterpolants = numInterpolants; this.expected = expected; @@ -57,12 +58,12 @@ public LinearUniformInterpolatorTests(double[][] input, int numInterpolants, dou @Before public void setUp() { - this.interpolator = new LinearUniformInterpolator(new SingleFeatureLinearUniformInterpolator()); + this.imputer = new LinearUniformImputer(false); } @Test - public void testInterpolation() { - double[][] actual = interpolator.interpolate(input, numInterpolants); + public void testImputation() { + double[][] actual = imputer.impute(input, numInterpolants); double delta = 1e-8; int numFeatures = expected.length; diff --git a/src/test/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputerTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputerTests.java new file mode 100644 index 000000000..fb39d83f2 --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/PreviousValueImputerTests.java @@ -0,0 +1,27 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import java.util.Arrays; + +import org.opensearch.test.OpenSearchTestCase; + +public class PreviousValueImputerTests extends OpenSearchTestCase { + public void testSingleFeatureImpute() { + PreviousValueImputer imputer = new PreviousValueImputer(); + + double[] samples = { 1.0, Double.NaN, 3.0, Double.NaN, 5.0 }; + double[] expected = { 1.0, 1.0, 3.0, 3.0, 5.0 }; + + assertTrue("Imputation failed", Arrays.equals(expected, imputer.singleFeatureImpute(samples, 0))); + + // The second test checks whether the method removes leading Double.NaN values from the array + samples = new double[] { Double.NaN, 2.0, Double.NaN, 4.0 }; + expected = new double[] { Double.NaN, 2.0, 2.0, 4.0 }; + + assertTrue("Imputation failed with leading NaN", Arrays.equals(expected, imputer.singleFeatureImpute(samples, 0))); + } +} diff --git a/src/test/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolatorTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/SingleFeatureLinearUniformImputerTests.java similarity index 54% rename from src/test/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolatorTests.java rename to src/test/java/org/opensearch/timeseries/dataprocessor/SingleFeatureLinearUniformImputerTests.java index 64324a0e4..0bf7bdffb 100644 --- a/src/test/java/org/opensearch/ad/dataprocessor/IntegerSensitiveSingleFeatureLinearUniformInterpolatorTests.java +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/SingleFeatureLinearUniformImputerTests.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.dataprocessor; +package org.opensearch.timeseries.dataprocessor; import static org.junit.Assert.assertTrue; @@ -23,25 +23,28 @@ import junitparams.Parameters; @RunWith(JUnitParamsRunner.class) -public class IntegerSensitiveSingleFeatureLinearUniformInterpolatorTests { +public class SingleFeatureLinearUniformImputerTests { - private IntegerSensitiveSingleFeatureLinearUniformInterpolator interpolator; + private Imputer imputer; @Before public void setup() { - interpolator = new IntegerSensitiveSingleFeatureLinearUniformInterpolator(); + imputer = new LinearUniformImputer(false); } - private Object[] interpolateData() { + private Object[] imputeData() { return new Object[] { new Object[] { new double[] { 25.25, 25.75 }, 3, new double[] { 25.25, 25.5, 25.75 } }, new Object[] { new double[] { 25, 75 }, 3, new double[] { 25, 50, 75 } }, - new Object[] { new double[] { 25, 75.5 }, 3, new double[] { 25, 50.25, 75.5 } }, }; + new Object[] { new double[] { 25, 75.5 }, 3, new double[] { 25, 50.25, 75.5 } }, + new Object[] { new double[] { 25.25, 25.75 }, 3, new double[] { 25.25, 25.5, 25.75 } }, + new Object[] { new double[] { 25, 75 }, 3, new double[] { 25, 50, 75 } }, + new Object[] { new double[] { 25, 75.5 }, 3, new double[] { 25, 50.25, 75.5 } } }; } @Test - @Parameters(method = "interpolateData") - public void interpolate_returnExpected(double[] samples, int num, double[] expected) { - assertTrue(Arrays.equals(expected, interpolator.interpolate(samples, num))); + @Parameters(method = "imputeData") + public void impute_returnExpected(double[] samples, int num, double[] expected) { + assertTrue(Arrays.equals(expected, imputer.singleFeatureImpute(samples, num))); } } diff --git a/src/test/java/org/opensearch/timeseries/dataprocessor/ZeroImputerTests.java b/src/test/java/org/opensearch/timeseries/dataprocessor/ZeroImputerTests.java new file mode 100644 index 000000000..8e03821e2 --- /dev/null +++ b/src/test/java/org/opensearch/timeseries/dataprocessor/ZeroImputerTests.java @@ -0,0 +1,39 @@ +/* + * Copyright OpenSearch Contributors + * SPDX-License-Identifier: Apache-2.0 + */ + +package org.opensearch.timeseries.dataprocessor; + +import static org.junit.Assert.assertArrayEquals; + +import org.junit.Before; +import org.junit.Test; +import org.junit.runner.RunWith; + +import junitparams.JUnitParamsRunner; +import junitparams.Parameters; + +@RunWith(JUnitParamsRunner.class) +public class ZeroImputerTests { + + private Imputer imputer; + + @Before + public void setup() { + imputer = new ZeroImputer(); + } + + private Object[] imputeData() { + return new Object[] { + new Object[] { new double[] { 25.25, Double.NaN, 25.75 }, 3, new double[] { 25.25, 0, 25.75 } }, + new Object[] { new double[] { Double.NaN, 25, 75 }, 3, new double[] { 0, 25, 75 } }, + new Object[] { new double[] { 25, 75.5, Double.NaN }, 3, new double[] { 25, 75.5, 0 } }, }; + } + + @Test + @Parameters(method = "imputeData") + public void impute_returnExpected(double[] samples, int num, double[] expected) { + assertArrayEquals("The arrays are not equal", expected, imputer.singleFeatureImpute(samples, num), 0.001); + } +} diff --git a/src/test/java/org/opensearch/ad/feature/NoPowermockSearchFeatureDaoTests.java b/src/test/java/org/opensearch/timeseries/feature/NoPowermockSearchFeatureDaoTests.java similarity index 93% rename from src/test/java/org/opensearch/ad/feature/NoPowermockSearchFeatureDaoTests.java rename to src/test/java/org/opensearch/timeseries/feature/NoPowermockSearchFeatureDaoTests.java index 
bb2b97bee..aa8ba932c 100644 --- a/src/test/java/org/opensearch/ad/feature/NoPowermockSearchFeatureDaoTests.java +++ b/src/test/java/org/opensearch/timeseries/feature/NoPowermockSearchFeatureDaoTests.java @@ -9,10 +9,12 @@ * GitHub history for details. */ -package org.opensearch.ad.feature; +package org.opensearch.timeseries.feature; import static org.hamcrest.core.IsInstanceOf.instanceOf; +import static org.junit.Assert.assertTrue; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.doAnswer; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.verify; @@ -54,17 +56,8 @@ import org.opensearch.action.search.SearchResponse.Clusters; import org.opensearch.action.search.SearchResponseSections; import org.opensearch.action.search.ShardSearchFailure; -import org.opensearch.ad.AbstractADTest; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.TestHelpers; -import org.opensearch.ad.dataprocessor.LinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.Feature; -import org.opensearch.ad.model.IntervalTimeConfiguration; import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.lease.Releasables; @@ -102,6 +95,17 @@ import org.opensearch.search.aggregations.metrics.InternalMax; import org.opensearch.search.aggregations.metrics.SumAggregationBuilder; import org.opensearch.search.internal.InternalSearchResponse; +import org.opensearch.timeseries.AbstractTimeSeriesTest; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TestHelpers; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.Feature; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.SecurityClientUtil; import com.google.common.collect.ImmutableList; @@ -110,13 +114,13 @@ * Create a new class for new tests related to SearchFeatureDao. 
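* (Editorial aside: the "NoPowermock" prefix appears to mark tests that rely on plain Mockito
* rather than PowerMock; compare the commented-out PowerMockito.mockStatic(ParseUtils.class)
* lines left behind in the SearchFeatureDaoParamTests and SearchFeatureDaoTests setups.)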
* */ -public class NoPowermockSearchFeatureDaoTests extends AbstractADTest { +public class NoPowermockSearchFeatureDaoTests extends AbstractTimeSeriesTest { private final Logger LOG = LogManager.getLogger(NoPowermockSearchFeatureDaoTests.class); private AnomalyDetector detector; private Client client; private SearchFeatureDao searchFeatureDao; - private LinearUniformInterpolator interpolator; + private Imputer imputer; private SecurityClientUtil clientUtil; private Settings settings; private ClusterService clusterService; @@ -142,27 +146,27 @@ public void setUp() throws Exception { hostField = "host"; detector = mock(AnomalyDetector.class); - when(detector.isMultientityDetector()).thenReturn(true); - when(detector.getCategoryField()).thenReturn(Arrays.asList(new String[] { serviceField, hostField })); + when(detector.isHighCardinality()).thenReturn(true); + when(detector.getCategoryFields()).thenReturn(Arrays.asList(new String[] { serviceField, hostField })); detectorId = "123"; - when(detector.getDetectorId()).thenReturn(detectorId); + when(detector.getId()).thenReturn(detectorId); when(detector.getTimeField()).thenReturn("testTimeField"); when(detector.getIndices()).thenReturn(Arrays.asList("testIndices")); IntervalTimeConfiguration detectionInterval = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES); - when(detector.getDetectionInterval()).thenReturn(detectionInterval); + when(detector.getInterval()).thenReturn(detectionInterval); when(detector.getFilterQuery()).thenReturn(QueryBuilders.matchAllQuery()); client = mock(Client.class); when(client.threadPool()).thenReturn(threadPool); - interpolator = new LinearUniformInterpolator(new SingleFeatureLinearUniformInterpolator()); + imputer = new LinearUniformImputer(false); settings = Settings.EMPTY; ClusterSettings clusterSettings = new ClusterSettings( Settings.EMPTY, Collections .unmodifiableSet( - new HashSet<>(Arrays.asList(AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, AnomalyDetectorSettings.PAGE_SIZE)) + new HashSet<>(Arrays.asList(AnomalyDetectorSettings.MAX_ENTITIES_FOR_PREVIEW, AnomalyDetectorSettings.AD_PAGE_SIZE)) ) ); clusterService = mock(ClusterService.class); @@ -170,20 +174,20 @@ public void setUp() throws Exception { clock = mock(Clock.class); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, settings); searchFeatureDao = new SearchFeatureDao( client, xContentRegistry(), // Important. 
Without this, ParseUtils cannot parse anything - interpolator, + imputer, clientUtil, settings, clusterService, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, clock, 1, 1, @@ -296,7 +300,7 @@ public void testGetHighestCountEntitiesUsingTermsAgg() { }).when(client).search(any(SearchRequest.class), any(ActionListener.class)); String categoryField = "fieldName"; - when(detector.getCategoryField()).thenReturn(Collections.singletonList(categoryField)); + when(detector.getCategoryFields()).thenReturn(Collections.singletonList(categoryField)); ActionListener> listener = mock(ActionListener.class); searchFeatureDao.getHighestCountEntities(detector, 10L, 20L, listener); @@ -366,11 +370,11 @@ public void testGetHighestCountEntitiesExhaustedPages() throws InterruptedExcept searchFeatureDao = new SearchFeatureDao( client, xContentRegistry(), - interpolator, + imputer, clientUtil, settings, clusterService, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, clock, 2, 1, @@ -412,11 +416,11 @@ public void testGetHighestCountEntitiesNotEnoughTime() throws InterruptedExcepti searchFeatureDao = new SearchFeatureDao( client, xContentRegistry(), - interpolator, + imputer, clientUtil, settings, clusterService, - AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE, + TimeSeriesSettings.NUM_SAMPLES_PER_TREE, clock, 2, 1, @@ -526,8 +530,9 @@ public void getColdStartSamplesForPeriodsTemplate(DocValueFormat format) throws .getColdStartSamplesForPeriods( detector, sampleRanges, - Entity.createSingleAttributeEntity("field", "abc"), + Optional.of(Entity.createSingleAttributeEntity("field", "abc")), true, + AnalysisType.AD, ActionListener.wrap(samples -> { assertEquals(3, samples.size()); for (int i = 0; i < samples.size(); i++) { @@ -559,8 +564,9 @@ public void getColdStartSamplesForPeriodsTemplate(DocValueFormat format) throws .getColdStartSamplesForPeriods( detector, sampleRanges, - Entity.createSingleAttributeEntity("field", "abc"), + Optional.of(Entity.createSingleAttributeEntity("field", "abc")), false, + AnalysisType.AD, ActionListener.wrap(samples -> { assertEquals(2, samples.size()); for (int i = 0; i < samples.size(); i++) { diff --git a/src/test/java/org/opensearch/ad/feature/SearchFeatureDaoParamTests.java b/src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoParamTests.java similarity index 90% rename from src/test/java/org/opensearch/ad/feature/SearchFeatureDaoParamTests.java rename to src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoParamTests.java index 0974be907..6d67b3f72 100644 --- a/src/test/java/org/opensearch/ad/feature/SearchFeatureDaoParamTests.java +++ b/src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoParamTests.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.feature; +package org.opensearch.timeseries.feature; import static java.util.Arrays.asList; import static org.junit.Assert.assertEquals; @@ -47,22 +47,11 @@ import org.opensearch.action.search.MultiSearchResponse.Item; import org.opensearch.action.search.SearchRequest; import org.opensearch.action.search.SearchResponse; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.dataprocessor.Interpolator; -import org.opensearch.ad.dataprocessor.LinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.cluster.service.ClusterService; import org.opensearch.common.action.ActionFuture; import org.opensearch.common.settings.Settings; -import org.opensearch.common.xcontent.LoggingDeprecationHandler; -import org.opensearch.common.xcontent.XContentType; import org.opensearch.core.action.ActionListener; import org.opensearch.core.xcontent.NamedXContentRegistry; import org.opensearch.index.query.QueryBuilders; @@ -76,8 +65,16 @@ import org.opensearch.search.aggregations.metrics.InternalTDigestPercentiles; import org.opensearch.search.aggregations.metrics.Max; import org.opensearch.search.aggregations.metrics.Percentile; -import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.SecurityClientUtil; import junitparams.JUnitParamsRunner; import junitparams.Parameters; @@ -135,21 +132,21 @@ public class SearchFeatureDaoParamTests { private Clock clock; private SearchRequest searchRequest; - private SearchSourceBuilder searchSourceBuilder; private MultiSearchRequest multiSearchRequest; private IntervalTimeConfiguration detectionInterval; private String detectorId; - private Interpolator interpolator; + private Imputer imputer; private Settings settings; @Before public void setup() throws Exception { MockitoAnnotations.initMocks(this); + // PowerMockito.mockStatic(ParseUtils.class); - interpolator = new LinearUniformInterpolator(new SingleFeatureLinearUniformInterpolator()); + imputer = new LinearUniformImputer(false); ExecutorService executorService = mock(ExecutorService.class); - when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); doAnswer(invocation -> { Runnable runnable = invocation.getArgument(0); runnable.run(); @@ -161,27 +158,25 @@ public void setup() throws Exception { when(client.threadPool()).thenReturn(threadPool); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener> listener = 
invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, settings); searchFeatureDao = spy( - new SearchFeatureDao(client, xContent, interpolator, clientUtil, settings, null, AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE) + new SearchFeatureDao(client, xContent, imputer, clientUtil, settings, null, TimeSeriesSettings.NUM_SAMPLES_PER_TREE) ); detectionInterval = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES); detectorId = "123"; - when(detector.getDetectorId()).thenReturn(detectorId); + when(detector.getId()).thenReturn(detectorId); when(detector.getTimeField()).thenReturn("testTimeField"); when(detector.getIndices()).thenReturn(Arrays.asList("testIndices")); - when(detector.getDetectionInterval()).thenReturn(detectionInterval); + when(detector.getInterval()).thenReturn(detectionInterval); when(detector.getFilterQuery()).thenReturn(QueryBuilders.matchAllQuery()); - when(detector.getCategoryField()).thenReturn(Collections.singletonList("a")); + when(detector.getCategoryFields()).thenReturn(Collections.singletonList("a")); - searchSourceBuilder = SearchSourceBuilder - .fromXContent(XContentType.JSON.xContent().createParser(xContent, LoggingDeprecationHandler.INSTANCE, "{}")); searchRequest = new SearchRequest(detector.getIndices().toArray(new String[0])); when(max.getName()).thenReturn(CommonName.AGG_NAME_MAX_TIME); diff --git a/src/test/java/org/opensearch/ad/feature/SearchFeatureDaoTests.java b/src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoTests.java similarity index 91% rename from src/test/java/org/opensearch/ad/feature/SearchFeatureDaoTests.java rename to src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoTests.java index a4142cf42..9731d31b5 100644 --- a/src/test/java/org/opensearch/ad/feature/SearchFeatureDaoTests.java +++ b/src/test/java/org/opensearch/timeseries/feature/SearchFeatureDaoTests.java @@ -9,7 +9,7 @@ * GitHub history for details. 
*/ -package org.opensearch.ad.feature; +package org.opensearch.timeseries.feature; import static java.util.Arrays.asList; import static java.util.Collections.emptyMap; @@ -55,17 +55,7 @@ import org.opensearch.action.search.SearchResponse; import org.opensearch.action.search.SearchResponseSections; import org.opensearch.action.search.ShardSearchFailure; -import org.opensearch.ad.AnomalyDetectorPlugin; -import org.opensearch.ad.NodeStateManager; -import org.opensearch.ad.constant.CommonName; -import org.opensearch.ad.dataprocessor.Interpolator; -import org.opensearch.ad.dataprocessor.LinearUniformInterpolator; -import org.opensearch.ad.dataprocessor.SingleFeatureLinearUniformInterpolator; import org.opensearch.ad.model.AnomalyDetector; -import org.opensearch.ad.model.Entity; -import org.opensearch.ad.model.IntervalTimeConfiguration; -import org.opensearch.ad.settings.AnomalyDetectorSettings; -import org.opensearch.ad.util.SecurityClientUtil; import org.opensearch.client.Client; import org.opensearch.common.action.ActionFuture; import org.opensearch.common.settings.Settings; @@ -98,6 +88,16 @@ import org.opensearch.search.aggregations.metrics.Percentile; import org.opensearch.search.builder.SearchSourceBuilder; import org.opensearch.threadpool.ThreadPool; +import org.opensearch.timeseries.AnalysisType; +import org.opensearch.timeseries.NodeStateManager; +import org.opensearch.timeseries.TimeSeriesAnalyticsPlugin; +import org.opensearch.timeseries.constant.CommonName; +import org.opensearch.timeseries.dataprocessor.Imputer; +import org.opensearch.timeseries.dataprocessor.LinearUniformImputer; +import org.opensearch.timeseries.model.Entity; +import org.opensearch.timeseries.model.IntervalTimeConfiguration; +import org.opensearch.timeseries.settings.TimeSeriesSettings; +import org.opensearch.timeseries.util.SecurityClientUtil; public class SearchFeatureDaoTests { private SearchFeatureDao searchFeatureDao; @@ -146,17 +146,18 @@ public class SearchFeatureDaoTests { private Map aggsMap; private IntervalTimeConfiguration detectionInterval; private String detectorId; - private Interpolator interpolator; + private Imputer imputer; private Settings settings; @Before public void setup() throws Exception { MockitoAnnotations.initMocks(this); + // PowerMockito.mockStatic(ParseUtils.class); - interpolator = new LinearUniformInterpolator(new SingleFeatureLinearUniformInterpolator()); + imputer = new LinearUniformImputer(false); ExecutorService executorService = mock(ExecutorService.class); - when(threadPool.executor(AnomalyDetectorPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); + when(threadPool.executor(TimeSeriesAnalyticsPlugin.AD_THREAD_POOL_NAME)).thenReturn(executorService); doAnswer(invocation -> { Runnable runnable = invocation.getArgument(0); runnable.run(); @@ -168,24 +169,24 @@ public void setup() throws Exception { when(client.threadPool()).thenReturn(threadPool); NodeStateManager nodeStateManager = mock(NodeStateManager.class); doAnswer(invocation -> { - ActionListener> listener = invocation.getArgument(1); + ActionListener> listener = invocation.getArgument(2); listener.onResponse(Optional.of(detector)); return null; - }).when(nodeStateManager).getAnomalyDetector(any(String.class), any(ActionListener.class)); + }).when(nodeStateManager).getConfig(any(String.class), eq(AnalysisType.AD), any(ActionListener.class)); clientUtil = new SecurityClientUtil(nodeStateManager, settings); searchFeatureDao = spy( - new SearchFeatureDao(client, xContent, interpolator, clientUtil, settings, null, 
         searchFeatureDao = spy(
-            new SearchFeatureDao(client, xContent, interpolator, clientUtil, settings, null, AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE)
+            new SearchFeatureDao(client, xContent, imputer, clientUtil, settings, null, TimeSeriesSettings.NUM_SAMPLES_PER_TREE)
         );

         detectionInterval = new IntervalTimeConfiguration(1, ChronoUnit.MINUTES);
         detectorId = "123";
-        when(detector.getDetectorId()).thenReturn(detectorId);
+        when(detector.getId()).thenReturn(detectorId);
         when(detector.getTimeField()).thenReturn("testTimeField");
         when(detector.getIndices()).thenReturn(Arrays.asList("testIndices"));
-        when(detector.getDetectionInterval()).thenReturn(detectionInterval);
+        when(detector.getInterval()).thenReturn(detectionInterval);
         when(detector.getFilterQuery()).thenReturn(QueryBuilders.matchAllQuery());
-        when(detector.getCategoryField()).thenReturn(Collections.singletonList("a"));
+        when(detector.getCategoryFields()).thenReturn(Collections.singletonList("a"));

         searchSourceBuilder = SearchSourceBuilder
             .fromXContent(XContentType.JSON.xContent().createParser(xContent, LoggingDeprecationHandler.INSTANCE, "{}"));
@@ -372,7 +373,7 @@ public void testGetEntityMinDataTime() {
         ActionListener<Optional<Long>> listener = mock(ActionListener.class);

         Entity entity = Entity.createSingleAttributeEntity("field", "app_1");
-        searchFeatureDao.getEntityMinDataTime(detector, entity, listener);
+        searchFeatureDao.getMinDataTime(detector, Optional.ofNullable(entity), AnalysisType.AD, listener);

         ArgumentCaptor<Optional<Long>> captor = ArgumentCaptor.forClass(Optional.class);
         verify(listener).onResponse(captor.capture());
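getEntityMinDataTime was entity-only; getMinDataTime is now the single entry point for both high-cardinality and single-stream analyses. A minimal sketch (not part of the patch) of the single-stream variant, assuming only the signature shown in the hunk above:

    // Single-stream callers would pass an empty Optional; the listener contract is unchanged.
    ActionListener<Optional<Long>> minTimeListener = ActionListener
        .wrap(minTime -> System.out.println("earliest data point: " + minTime), e -> { throw new AssertionError(e); });
    searchFeatureDao.getMinDataTime(detector, Optional.empty(), AnalysisType.AD, minTimeListener);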
diff --git a/src/test/java/org/opensearch/timeseries/indices/IndexManagementIntegTestCase.java b/src/test/java/org/opensearch/timeseries/indices/IndexManagementIntegTestCase.java
new file mode 100644
index 000000000..56a7ef0c8
--- /dev/null
+++ b/src/test/java/org/opensearch/timeseries/indices/IndexManagementIntegTestCase.java
@@ -0,0 +1,104 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.timeseries.indices;
+
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.Mockito.doAnswer;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.never;
+import static org.mockito.Mockito.verify;
+
+import java.io.IOException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.CountDownLatch;
+import java.util.concurrent.TimeUnit;
+
+import org.mockito.ArgumentCaptor;
+import org.opensearch.common.settings.Settings;
+import org.opensearch.common.xcontent.XContentHelper;
+import org.opensearch.common.xcontent.XContentType;
+import org.opensearch.core.action.ActionListener;
+import org.opensearch.core.common.bytes.BytesArray;
+import org.opensearch.test.OpenSearchIntegTestCase;
+import org.opensearch.timeseries.common.exception.EndRunException;
+import org.opensearch.timeseries.constant.CommonMessages;
+import org.opensearch.timeseries.function.ExecutorFunction;
+
+public abstract class IndexManagementIntegTestCase<IndexType extends Enum<IndexType> & TimeSeriesIndex, ISMType extends IndexManagement<IndexType>>
+    extends OpenSearchIntegTestCase {
+
+    public void validateCustomIndexForBackendJob(ISMType indices, String resultMapping) throws IOException, InterruptedException {
+
+        Map<String, Object> asMap = XContentHelper.convertToMap(new BytesArray(resultMapping), false, XContentType.JSON).v2();
+        String resultIndex = "test_index";
+
+        client()
+            .admin()
+            .indices()
+            .prepareCreate(resultIndex)
+            .setSettings(Settings.builder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
+            .setMapping(asMap)
+            .get();
+        ensureGreen(resultIndex);
+
+        String securityLogId = "logId";
+        String user = "testUser";
+        List<String> roles = Arrays.asList("role1", "role2");
+        ExecutorFunction function = mock(ExecutorFunction.class);
+        ActionListener listener = mock(ActionListener.class);
+
+        CountDownLatch latch = new CountDownLatch(1);
+        doAnswer(invocation -> {
+            latch.countDown();
+            return null;
+        }).when(function).execute();
+        indices.validateCustomIndexForBackendJob(resultIndex, securityLogId, user, roles, function, listener);
+        assertTrue(latch.await(20, TimeUnit.SECONDS));
+        verify(listener, never()).onFailure(any(Exception.class));
+    }
+
+    public void validateCustomIndexForBackendJobInvalidMapping(ISMType indices) {
+        String resultIndex = "test_index";
+
+        client()
+            .admin()
+            .indices()
+            .prepareCreate(resultIndex)
+            .setSettings(Settings.builder().put("index.number_of_shards", 1).put("index.number_of_replicas", 0))
+            .setMapping("ip", "type=ip")
+            .get();
+        ensureGreen(resultIndex);
+
+        String securityLogId = "logId";
+        String user = "testUser";
+        List<String> roles = Arrays.asList("role1", "role2");
+        ExecutorFunction function = mock(ExecutorFunction.class);
+        ActionListener listener = mock(ActionListener.class);
+
+        indices.validateCustomIndexForBackendJob(resultIndex, securityLogId, user, roles, function, listener);
+
+        ArgumentCaptor<EndRunException> exceptionCaptor = ArgumentCaptor.forClass(EndRunException.class);
+        verify(listener).onFailure(exceptionCaptor.capture());
+        assertEquals("Result index mapping is not correct", exceptionCaptor.getValue().getMessage());
+    }
+
+    public void validateCustomIndexForBackendJobNoIndex(ISMType indices) {
+        String resultIndex = "testIndex";
+        String securityLogId = "logId";
+        String user = "testUser";
+        List<String> roles = Arrays.asList("role1", "role2");
+        ExecutorFunction function = mock(ExecutorFunction.class);
+        ActionListener listener = mock(ActionListener.class);
+
+        indices.validateCustomIndexForBackendJob(resultIndex, securityLogId, user, roles, function, listener);
+
+        ArgumentCaptor<EndRunException> exceptionCaptor = ArgumentCaptor.forClass(EndRunException.class);
+        verify(listener).onFailure(exceptionCaptor.capture());
+        assertEquals(CommonMessages.CAN_NOT_FIND_RESULT_INDEX + resultIndex, exceptionCaptor.getValue().getMessage());
+    }
+}
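The base class leaves the concrete index types to each plugin. A minimal sketch (not part of the patch) of how a plugin-side suite could bind the type parameters, assuming the AD module's ADIndex and ADIndexManagement types and fixture-provided arguments (both assumptions):

    public class ADIndexMappingIntegTests extends IndexManagementIntegTestCase<ADIndex, ADIndexManagement> {
        public void testCustomResultIndex() throws Exception {
            // adIndexManagement and resultMappingJson are hypothetical fixtures supplied by the AD plugin's tests
            validateCustomIndexForBackendJob(adIndexManagement, resultMappingJson);
        }
    }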
diff --git a/src/test/java/org/opensearch/timeseries/util/ClientUtilTests.java b/src/test/java/org/opensearch/timeseries/util/ClientUtilTests.java
new file mode 100644
index 000000000..d4241fc4f
--- /dev/null
+++ b/src/test/java/org/opensearch/timeseries/util/ClientUtilTests.java
@@ -0,0 +1,153 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.timeseries.util;
+
+import static org.hamcrest.Matchers.equalTo;
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.eq;
+import static org.mockito.Mockito.doAnswer;
+import static org.mockito.Mockito.mock;
+
+import java.util.Collections;
+import java.util.concurrent.CountDownLatch;
+import java.util.concurrent.TimeUnit;
+import java.util.function.BiConsumer;
+
+import org.opensearch.action.LatchedActionListener;
+import org.opensearch.ad.transport.AnomalyResultAction;
+import org.opensearch.ad.transport.AnomalyResultRequest;
+import org.opensearch.ad.transport.AnomalyResultResponse;
+import org.opensearch.client.Client;
+import org.opensearch.core.action.ActionListener;
+import org.opensearch.test.OpenSearchTestCase;
+import org.opensearch.timeseries.model.FeatureData;
+
+public class ClientUtilTests extends OpenSearchTestCase {
+    private AnomalyResultRequest asyncRequest;
+
+    private ClientUtil clientUtil;
+
+    private Client client;
+    private CountDownLatch latch;
+    private ActionListener<AnomalyResultResponse> latchListener;
+    private AnomalyResultResponse actualResponse;
+    private Exception exception;
+    private ActionListener<AnomalyResultResponse> listener;
+
+    @Override
+    public void setUp() throws Exception {
+        super.setUp();
+        asyncRequest = new AnomalyResultRequest("abc123", 100, 200);
+
+        listener = new ActionListener<>() {
+            @Override
+            public void onResponse(AnomalyResultResponse resultResponse) {
+                actualResponse = resultResponse;
+            }
+
+            @Override
+            public void onFailure(Exception e) {
+                exception = e;
+            }
+        };
+        actualResponse = null;
+        exception = null;
+
+        latch = new CountDownLatch(1);
+        latchListener = new LatchedActionListener<>(listener, latch);
+
+        client = mock(Client.class);
+        clientUtil = new ClientUtil(client);
+    }
+
+    public void testAsyncRequestOnSuccess() throws InterruptedException {
+        AnomalyResultResponse expected = new AnomalyResultResponse(
+            4d,
+            0.993,
+            1.01,
+            Collections.singletonList(new FeatureData("xyz", "foo", 0d)),
+            randomAlphaOfLength(4),
+            randomLong(),
+            randomLong(),
+            randomBoolean(),
+            randomInt(),
+            new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) },
+            new double[] { randomDouble(), randomDouble() },
+            new double[][] { new double[] { randomDouble(), randomDouble() } },
+            new double[] { randomDouble() },
+            randomDoubleBetween(1.1, 10.0, true)
+        );
+        BiConsumer<AnomalyResultRequest, ActionListener<AnomalyResultResponse>> consumer = (request, actionListener) -> {
+            // simulate a successful call by completing the latched listener directly
+            latchListener.onResponse(expected);
+        };
+        clientUtil.asyncRequest(asyncRequest, consumer, listener);
+
+        assertTrue(latch.await(30L, TimeUnit.SECONDS));
+        assertNotNull(actualResponse);
+        assertNull(exception);
+        org.hamcrest.MatcherAssert.assertThat(actualResponse, equalTo(expected));
+    }
+
+    public void testAsyncRequestOnFailure() {
+        Exception testException = new Exception("Test exception");
+        BiConsumer<AnomalyResultRequest, ActionListener<AnomalyResultResponse>> consumer = (request, actionListener) -> {
+            // simulate a failed call by failing the latched listener directly
+            latchListener.onFailure(testException);
+        };
+        clientUtil.asyncRequest(asyncRequest, consumer, listener);
+        assertNull(actualResponse);
+        assertNotNull(exception);
+        assertEquals("Test exception", exception.getMessage());
+    }
+
+    @SuppressWarnings("unchecked")
+    public void testExecuteOnSuccess() throws InterruptedException {
+        AnomalyResultResponse expected = new AnomalyResultResponse(
+            4d,
+            0.993,
+            1.01,
+            Collections.singletonList(new FeatureData("xyz", "foo", 0d)),
+            randomAlphaOfLength(4),
+            randomLong(),
+            randomLong(),
+            randomBoolean(),
+            randomInt(),
+            new double[] { randomDoubleBetween(0, 1.0, true), randomDoubleBetween(0, 1.0, true) },
+            new double[] { randomDouble(), randomDouble() },
+            new double[][] { new double[] { randomDouble(), randomDouble() } },
+            new double[] { randomDouble() },
+            randomDoubleBetween(1.1, 10.0, true)
+        );
+        doAnswer(invocationOnMock -> {
+            ((ActionListener<AnomalyResultResponse>) invocationOnMock.getArguments()[2]).onResponse(expected);
+            latch.countDown();
+            return null;
+        }).when(client).execute(eq(AnomalyResultAction.INSTANCE), any(), any());
+        clientUtil.execute(AnomalyResultAction.INSTANCE, asyncRequest, latchListener);
+
+        assertTrue(latch.await(30L, TimeUnit.SECONDS));
+        assertNotNull(actualResponse);
+        assertNull(exception);
+        org.hamcrest.MatcherAssert.assertThat(actualResponse, equalTo(expected));
+    }
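+    // Client.execute(action, request, listener) passes the response listener as its third argument,
+    // so these stubs reach it through getArguments()[2]; the latch releases the assertions once the
+    // canned response or failure has been delivered.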
+    @SuppressWarnings("unchecked")
+    public void testExecuteOnFailure() {
+        Exception testException = new Exception("Test exception");
+        doAnswer(invocationOnMock -> {
+            ((ActionListener<AnomalyResultResponse>) invocationOnMock.getArguments()[2]).onFailure(testException);
+            latch.countDown();
+            return null;
+        }).when(client).execute(eq(AnomalyResultAction.INSTANCE), any(), any());
+        clientUtil.execute(AnomalyResultAction.INSTANCE, asyncRequest, latchListener);
+        assertNull(actualResponse);
+        assertNotNull(exception);
+        assertEquals("Test exception", exception.getMessage());
+    }
+}
diff --git a/src/test/java/org/opensearch/timeseries/util/LTrimTests.java b/src/test/java/org/opensearch/timeseries/util/LTrimTests.java
new file mode 100644
index 000000000..384982828
--- /dev/null
+++ b/src/test/java/org/opensearch/timeseries/util/LTrimTests.java
@@ -0,0 +1,43 @@
+/*
+ * Copyright OpenSearch Contributors
+ * SPDX-License-Identifier: Apache-2.0
+ */
+
+package org.opensearch.timeseries.util;
+
+import org.opensearch.test.OpenSearchTestCase;
+
+public class LTrimTests extends OpenSearchTestCase {
+
+    public void testLtrimEmptyArray() {
+
+        double[][] input = {};
+        double[][] expectedOutput = {};
+
+        assertArrayEquals(expectedOutput, DataUtil.ltrim(input));
+    }
+
+    public void testLtrimAllNaN() {
+
+        double[][] input = { { Double.NaN, Double.NaN }, { Double.NaN, Double.NaN }, { Double.NaN, Double.NaN } };
+        double[][] expectedOutput = {};
+
+        assertArrayEquals(expectedOutput, DataUtil.ltrim(input));
+    }
+
+    public void testLtrimSomeNaN() {
+
+        double[][] input = { { Double.NaN, Double.NaN }, { 1.0, 2.0 }, { 3.0, 4.0 } };
+        double[][] expectedOutput = { { 1.0, 2.0 }, { 3.0, 4.0 } };
+
+        assertArrayEquals(expectedOutput, DataUtil.ltrim(input));
+    }
+
+    public void testLtrimNoNaN() {
+
+        double[][] input = { { 1.0, 2.0 }, { 3.0, 4.0 }, { 5.0, 6.0 } };
+        double[][] expectedOutput = { { 1.0, 2.0 }, { 3.0, 4.0 }, { 5.0, 6.0 } };
+
+        assertArrayEquals(expectedOutput, DataUtil.ltrim(input));
+    }
+}
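Read together, the four cases pin down the contract of DataUtil.ltrim: drop every leading row that still contains a NaN, keep everything from the first fully-observed row onward, and pass empty input through unchanged. A reference sketch consistent with these tests, assuming nothing about the shipped DataUtil beyond that contract:

    import java.util.Arrays;

    public final class LtrimSketch {
        // Return the sub-array starting at the first row that contains no NaN values.
        static double[][] ltrim(double[][] data) {
            int start = 0;
            outer: for (; start < data.length; start++) {
                for (double v : data[start]) {
                    if (Double.isNaN(v)) {
                        continue outer; // this row still has a gap; keep trimming
                    }
                }
                break; // first fully-observed row found
            }
            return Arrays.copyOfRange(data, start, data.length);
        }
    }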
diff --git a/src/test/java/org/opensearch/ad/util/MultiResponsesDelegateActionListenerTests.java b/src/test/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListenerTests.java
similarity index 96%
rename from src/test/java/org/opensearch/ad/util/MultiResponsesDelegateActionListenerTests.java
rename to src/test/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListenerTests.java
index f879a71c8..b21ca79bf 100644
--- a/src/test/java/org/opensearch/ad/util/MultiResponsesDelegateActionListenerTests.java
+++ b/src/test/java/org/opensearch/timeseries/util/MultiResponsesDelegateActionListenerTests.java
@@ -9,11 +9,11 @@
  * GitHub history for details.
  */

-package org.opensearch.ad.util;
+package org.opensearch.timeseries.util;

 import static org.mockito.Mockito.mock;
 import static org.mockito.Mockito.verify;
-import static org.opensearch.ad.TestHelpers.randomHCADAnomalyDetectResult;
+import static org.opensearch.timeseries.TestHelpers.randomHCADAnomalyDetectResult;

 import java.util.ArrayList;
 import java.util.concurrent.CountDownLatch;
diff --git a/src/test/java/test/org/opensearch/ad/util/FakeNode.java b/src/test/java/test/org/opensearch/ad/util/FakeNode.java
index 15dfe9f3c..1fc43e62d 100644
--- a/src/test/java/test/org/opensearch/ad/util/FakeNode.java
+++ b/src/test/java/test/org/opensearch/ad/util/FakeNode.java
@@ -28,7 +28,6 @@
 import org.apache.logging.log4j.LogManager;
 import org.apache.logging.log4j.core.Logger;
 import org.apache.lucene.util.SetOnce;
-import org.opensearch.BwcTests;
 import org.opensearch.Version;
 import org.opensearch.action.admin.cluster.node.tasks.cancel.TransportCancelTasksAction;
 import org.opensearch.action.admin.cluster.node.tasks.list.TransportListTasksAction;
@@ -76,7 +75,7 @@ public FakeNode(
             Settings.EMPTY,
             new MockNioTransport(
                 Settings.EMPTY,
-                BwcTests.V_1_1_0,
+                Version.V_2_1_0,
                 threadPool,
                 new NetworkService(Collections.emptyList()),
                 PageCacheRecycler.NON_RECYCLING_INSTANCE,
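                 // Version.V_2_1_0 stands in for the removed test-only BwcTests.V_1_1_0 constant; any
                 // released wire version that both ends of the fake transport can negotiate should work here.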
config.getClock() : Clock.systemUTC(); @@ -96,19 +96,19 @@ public static EntityModel createEmptyModel(Entity entity) { public static EntityModel createNonEmptyModel(String detectorId, int sampleSize, Entity entity) { Queue samples = createQueueSamples(sampleSize); - int numDataPoints = random.nextInt(1000) + AnomalyDetectorSettings.NUM_MIN_SAMPLES; + int numDataPoints = random.nextInt(1000) + TimeSeriesSettings.NUM_MIN_SAMPLES; ThresholdedRandomCutForest trcf = new ThresholdedRandomCutForest( ThresholdedRandomCutForest .builder() .dimensions(1) - .sampleSize(AnomalyDetectorSettings.NUM_SAMPLES_PER_TREE) - .numberOfTrees(AnomalyDetectorSettings.NUM_TREES) - .timeDecay(AnomalyDetectorSettings.TIME_DECAY) - .outputAfter(AnomalyDetectorSettings.NUM_MIN_SAMPLES) + .sampleSize(TimeSeriesSettings.NUM_SAMPLES_PER_TREE) + .numberOfTrees(TimeSeriesSettings.NUM_TREES) + .timeDecay(TimeSeriesSettings.TIME_DECAY) + .outputAfter(TimeSeriesSettings.NUM_MIN_SAMPLES) .initialAcceptFraction(0.125d) .parallelExecutionEnabled(false) .internalShinglingEnabled(true) - .anomalyRate(1 - AnomalyDetectorSettings.THRESHOLD_MIN_PVALUE) + .anomalyRate(1 - TimeSeriesSettings.THRESHOLD_MIN_PVALUE) .transformMethod(TransformMethod.NORMALIZE) .alertOnce(true) .autoAdjust(true) diff --git a/src/test/java/test/org/opensearch/ad/util/RandomModelStateConfig.java b/src/test/java/test/org/opensearch/ad/util/RandomModelStateConfig.java index 757ba1bc5..25a2da1bd 100644 --- a/src/test/java/test/org/opensearch/ad/util/RandomModelStateConfig.java +++ b/src/test/java/test/org/opensearch/ad/util/RandomModelStateConfig.java @@ -38,7 +38,7 @@ public Float getPriority() { return priority; } - public String getDetectorId() { + public String getId() { return detectorId; } diff --git a/src/test/resources/security/sample.pem b/src/test/resources/security/sample.pem index 671f94539..a1fc20a77 100644 --- a/src/test/resources/security/sample.pem +++ b/src/test/resources/security/sample.pem @@ -22,4 +22,4 @@ zTclAzfQhqmKBTYQ/3lJ3GhRQvXIdYTe+t4aq78TCawp1nSN+vdH/1geG6QjMn5N 1FU8tovDd4x8Ib/0dv8RJx+n9gytI8n/giIaDCEbfLLpe4EkV5e5UNpOnRgJjjuy vtZutc81TQnzBtkS9XuulovDE0qI+jQrKkKu8xgGLhgH0zxnPkKtUg2I3Aq6zl1L zYkEOUF8Y25J6WeY88Yfnc0iigI+Pnz5NK8R9GL7TYo= ------END CERTIFICATE----- \ No newline at end of file +-----END CERTIFICATE-----