Enhance changelog workflow to check for missing labels #163

Open · wants to merge 16 commits into base: main
Commits (16):
4fc3a02  [Snapshot Interop] Add changes for overriding remote store and replic… (harishbhakuni, Apr 11, 2024)
645b1f1  [Derived Fields] PR4: Capability to define derived fields in search r… (rishabhmaurya, Apr 11, 2024)
7103e56  Add ShardBatchCache to support caching for TransportNodesListGatewayS… (amkhar, Apr 11, 2024)
5d939b9  [BUG] Fix org.opensearch.common.xcontent.XContentParserTests.testStri… (reta, Apr 11, 2024)
52ce070  [Tiered Caching] Make took time policy dynamic and add additional int… (sgup432, Apr 11, 2024)
02e3c56  Replace the right version to fix backward compatibility introduced wi… (rishabhmaurya, Apr 11, 2024)
7345371  Implement interface changes for s3 plugin to read/write blob with obj… (skumawat2025, Apr 12, 2024)
c168e1c  Update index settings during remote store migration (#12748) (ltaragi, Apr 12, 2024)
f2e2a85  Refactoring globMatch using simpleMatchWithNormalizedStrings from Reg… (niyatiagg, Apr 12, 2024)
cc22310  [Tiered Caching] Stats rework (1/3): Interfaces and implementations f… (peteralfonsi, Apr 12, 2024)
e828c18  Fix flakiness with SegmentReplicationSuiteIT (#11977) (mch2, Apr 14, 2024)
6bc04b4  [segment replication] decouple the rateLimiter of segrep and recovery… (Ferrari248, Apr 14, 2024)
416083c  Bump gradle/wrapper-validation-action from 2 to 3 (#13192) (dependabot[bot], Apr 15, 2024)
39ac2df  Update to Apache Lucene 9.11.0-snapshot-fb97840 (#13195) (reta, Apr 15, 2024)
00df37e  [Tiered Caching] Ehcache Disk cache IT (#12904) (sgup432, Apr 15, 2024)
1212672  Enhance changelog workflow to check for missing labels (kotwanikunal, Apr 10, 2024)
13 changes: 13 additions & 0 deletions .github/workflows/changelog_verifier.yml
@@ -26,6 +26,19 @@ jobs:
changeLogPath: 'CHANGELOG.md'
continue-on-error: true
- run: |
    # The check may have been skipped, leading to success for both jobs
    if [[ ${{ steps.verify-changelog-3x.outcome }} == 'success' && ${{ steps.verify-changelog.outcome }} == 'success' ]]; then
      exit 0
    fi

    if [[ ${{ steps.verify-changelog-3x.outcome }} == 'failure' && ${{ steps.verify-changelog.outcome }} == 'failure' ]]; then
      echo "error: Please ensure a changelog entry exists in CHANGELOG.md or CHANGELOG-3.0.md"
      exit 1
    fi

    # Concatenate the labels and check whether the joined string contains "backport"
    has_backport_label=${{ contains(join(github.event.pull_request.labels.*.name, ', '), 'backport') }}
    if [[ ${{ steps.verify-changelog.outcome }} == 'success' && $has_backport_label == false ]]; then
      echo "error: Please make sure that the PR has a backport label associated with it when making an entry to the CHANGELOG.md file"
      exit 1
    fi
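
For context, the added step combines the outcomes of the two changelog-verifier steps with the PR's labels. Below is a minimal sketch of the same decision logic in plain bash, runnable outside GitHub Actions; `OUTCOME_3X`, `OUTCOME_MAIN`, and `PR_LABELS` are hypothetical placeholder variables standing in for the `steps.*.outcome` and label expressions, not names used by the workflow.

```bash
#!/usr/bin/env bash
# Sketch only: OUTCOME_3X / OUTCOME_MAIN stand in for the outcomes of the
# CHANGELOG-3.0.md and CHANGELOG.md verifier steps; PR_LABELS stands in for
# the comma-joined PR label names.
OUTCOME_3X="${OUTCOME_3X:-failure}"
OUTCOME_MAIN="${OUTCOME_MAIN:-success}"
PR_LABELS="${PR_LABELS:-enhancement, backport 2.x}"

# Both checks passed (or were skipped): nothing to enforce.
if [[ "$OUTCOME_3X" == "success" && "$OUTCOME_MAIN" == "success" ]]; then
  exit 0
fi

# Neither changelog has an entry: fail.
if [[ "$OUTCOME_3X" == "failure" && "$OUTCOME_MAIN" == "failure" ]]; then
  echo "error: Please ensure a changelog entry exists in CHANGELOG.md or CHANGELOG-3.0.md"
  exit 1
fi

# An entry only in CHANGELOG.md should be accompanied by a backport label.
if [[ "$OUTCOME_MAIN" == "success" && "$PR_LABELS" != *backport* ]]; then
  echo "error: A CHANGELOG.md entry requires a backport label on the PR"
  exit 1
fi
```

For example, running the sketch as `OUTCOME_MAIN=success PR_LABELS="bug" bash check_labels.sh` (a hypothetical filename) exercises the missing-label failure path.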
2 changes: 1 addition & 1 deletion .github/workflows/wrapper.yml
@@ -8,4 +8,4 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: gradle/wrapper-validation-action@v2
- uses: gradle/wrapper-validation-action@v3
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -11,12 +11,16 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Allow setting KEYSTORE_PASSWORD through env variable ([#12865](https://github.com/opensearch-project/OpenSearch/pull/12865))
- [Concurrent Segment Search] Perform buildAggregation concurrently and support Composite Aggregations ([#12697](https://github.com/opensearch-project/OpenSearch/pull/12697))
- [Concurrent Segment Search] Disable concurrent segment search for system indices and throttled requests ([#12954](https://github.com/opensearch-project/OpenSearch/pull/12954))
- [Tiered Caching] Make took time caching policy setting dynamic ([#13063](https://github.com/opensearch-project/OpenSearch/pull/13063))
- Derived fields support to derive field values at query time without indexing ([#12569](https://github.com/opensearch-project/OpenSearch/pull/12569))
- Detect breaking changes on pull requests ([#9044](https://github.com/opensearch-project/OpenSearch/pull/9044))
- Add cluster primary balance constraint for rebalancing with buffer ([#12656](https://github.com/opensearch-project/OpenSearch/pull/12656))
- [Remote Store] Make translog transfer timeout configurable ([#12704](https://github.com/opensearch-project/OpenSearch/pull/12704))
- Reject Resize index requests (i.e., split, shrink and clone) while DocRep to SegRep migration is in progress ([#12686](https://github.com/opensearch-project/OpenSearch/pull/12686))
- Add support for more than one protocol for transport ([#12967](https://github.com/opensearch-project/OpenSearch/pull/12967))
- [Tiered Caching] Add dimension-based stats to ICache implementations ([#12531](https://github.com/opensearch-project/OpenSearch/pull/12531))
- Add changes for overriding remote store and replication settings during snapshot restore ([#11868](https://github.com/opensearch-project/OpenSearch/pull/11868))
- Add an individual setting of rate limiter for segment replication ([#12959](https://github.com/opensearch-project/OpenSearch/pull/12959))

### Dependencies
- Bump `org.apache.commons:commons-configuration2` from 2.10.0 to 2.10.1 ([#12896](https://github.com/opensearch-project/OpenSearch/pull/12896))
@@ -29,11 +33,13 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `org.apache.commons:commonslang` from 3.13.0 to 3.14.0 ([#12627](https://github.com/opensearch-project/OpenSearch/pull/12627))
- Bump Apache Tika from 2.6.0 to 2.9.2 ([#12627](https://github.com/opensearch-project/OpenSearch/pull/12627))
- Bump `com.gradle.enterprise` from 3.16.2 to 3.17 ([#13116](https://github.com/opensearch-project/OpenSearch/pull/13116))
- Bump `gradle/wrapper-validation-action` from 2 to 3 ([#13192](https://github.com/opensearch-project/OpenSearch/pull/13192))

### Changed
- [BWC and API enforcement] Enforcing the presence of API annotations at build time ([#12872](https://github.com/opensearch-project/OpenSearch/pull/12872))
- Improve built-in secure transports support ([#12907](https://github.com/opensearch-project/OpenSearch/pull/12907))
- Update links to documentation in rest-api-spec ([#13043](https://github.com/opensearch-project/OpenSearch/pull/13043))
- Refactoring globMatch using simpleMatchWithNormalizedStrings from Regex ([#13104](https://github.com/opensearch-project/OpenSearch/pull/13104))

### Deprecated

2 changes: 1 addition & 1 deletion buildSrc/version.properties
@@ -1,5 +1,5 @@
opensearch = 3.0.0
lucene = 9.11.0-snapshot-8a555eb
lucene = 9.11.0-snapshot-fb97840

bundled_jdk_vendor = adoptium
bundled_jdk = 21.0.2+13
@@ -54,15 +54,19 @@
import org.opensearch.action.search.SearchScrollRequest;
import org.opensearch.client.core.CountRequest;
import org.opensearch.client.core.CountResponse;
import org.opensearch.common.geo.ShapeRelation;
import org.opensearch.common.unit.TimeValue;
import org.opensearch.common.xcontent.XContentFactory;
import org.opensearch.core.common.bytes.BytesReference;
import org.opensearch.core.rest.RestStatus;
import org.opensearch.core.xcontent.MediaTypeRegistry;
import org.opensearch.core.xcontent.XContentBuilder;
import org.opensearch.geometry.Rectangle;
import org.opensearch.index.query.GeoShapeQueryBuilder;
import org.opensearch.index.query.MatchQueryBuilder;
import org.opensearch.index.query.QueryBuilder;
import org.opensearch.index.query.QueryBuilders;
import org.opensearch.index.query.RangeQueryBuilder;
import org.opensearch.index.query.ScriptQueryBuilder;
import org.opensearch.index.query.TermsQueryBuilder;
import org.opensearch.join.aggregations.Children;
@@ -102,6 +106,8 @@
import org.opensearch.search.suggest.Suggest;
import org.opensearch.search.suggest.SuggestBuilder;
import org.opensearch.search.suggest.phrase.PhraseSuggestionBuilder;
import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import org.hamcrest.Matchers;
import org.junit.Before;

@@ -116,6 +122,7 @@
import java.util.concurrent.TimeUnit;

import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.opensearch.index.query.QueryBuilders.geoShapeQuery;
import static org.opensearch.test.hamcrest.OpenSearchAssertions.assertToXContentEquivalent;
import static org.hamcrest.Matchers.arrayContaining;
import static org.hamcrest.Matchers.both;
@@ -764,6 +771,228 @@ public void testSearchWithWeirdScriptFields() throws Exception {
}
}

public void testSearchWithDerivedFields() throws Exception {
// Just testing DerivedField definition from SearchSourceBuilder derivedField()
// We are not testing the full functionality here
Request doc = new Request("PUT", "test/_doc/1");
doc.setJsonEntity("{\"field\":\"value\"}");
client().performRequest(doc);
client().performRequest(new Request("POST", "/test/_refresh"));
// Keyword field
{
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "keyword", new Script("emit(params._source[\"field\"])"))
.fetchField("result")
.query(new TermsQueryBuilder("result", "value"))
);
SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals("value", values.get(0));

// multi valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField(
"result",
"keyword",
new Script("emit(params._source[\"field\"]);emit(params._source[\"field\"] + \"_2\")")
)
.query(new TermsQueryBuilder("result", "value_2"))
.fetchField("result")
);
searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals("value", values.get(0));
assertEquals("value_2", values.get(1));
}
// Boolean field
{
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "boolean", new Script("emit(((String)params._source[\"field\"]).equals(\"value\"))"))
.query(new TermsQueryBuilder("result", "true"))
.fetchField("result")
);
SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals(true, values.get(0));
}
// Long field
{
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "long", new Script("emit(Long.MAX_VALUE)"))
.query(new RangeQueryBuilder("result").from(Long.MAX_VALUE - 1).to(Long.MAX_VALUE))
.fetchField("result")
);

SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals(Long.MAX_VALUE, values.get(0));

// multi-valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "long", new Script("emit(Long.MAX_VALUE); emit(Long.MIN_VALUE);"))
.query(new RangeQueryBuilder("result").from(Long.MIN_VALUE).to(Long.MIN_VALUE + 1))
.fetchField("result")
);

searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals(Long.MAX_VALUE, values.get(0));
assertEquals(Long.MIN_VALUE, values.get(1));
}
// Double field
{
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "double", new Script("emit(Double.MAX_VALUE)"))
.query(new RangeQueryBuilder("result").from(Double.MAX_VALUE - 1).to(Double.MAX_VALUE))
.fetchField("result")
);
SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals(Double.MAX_VALUE, values.get(0));

// multi-valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "double", new Script("emit(Double.MAX_VALUE); emit(Double.MIN_VALUE);"))
.query(new RangeQueryBuilder("result").from(Double.MIN_VALUE).to(Double.MIN_VALUE + 1))
.fetchField("result")
);

searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals(Double.MAX_VALUE, values.get(0));
assertEquals(Double.MIN_VALUE, values.get(1));
}
// Date field
{
DateTime date1 = new DateTime(1990, 12, 29, 0, 0, DateTimeZone.UTC);
DateTime date2 = new DateTime(1990, 12, 30, 0, 0, DateTimeZone.UTC);
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "date", new Script("emit(" + date1.getMillis() + "L)"))
.query(new RangeQueryBuilder("result").from(date1.toString()).to(date2.toString()))
.fetchField("result")
);

SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals(date1.toString(), values.get(0));

// multi-valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "date", new Script("emit(" + date1.getMillis() + "L); " + "emit(" + date2.getMillis() + "L)"))
.query(new RangeQueryBuilder("result").from(date1.toString()).to(date2.toString()))
.fetchField("result")
);

searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals(date1.toString(), values.get(0));
assertEquals(date2.toString(), values.get(1));
}
// Geo field
{
GeoShapeQueryBuilder qb = geoShapeQuery("result", new Rectangle(-35, 35, 35, -35));
qb.relation(ShapeRelation.INTERSECTS);
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "geo_point", new Script("emit(10.0, 20.0)"))
.query(qb)
.fetchField("result")
);

SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals(10.0, ((HashMap) values.get(0)).get("lat"));
assertEquals(20.0, ((HashMap) values.get(0)).get("lon"));

// multi-valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "geo_point", new Script("emit(10.0, 20.0); emit(20.0, 30.0);"))
.query(qb)
.fetchField("result")
);

searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals(10.0, ((HashMap) values.get(0)).get("lat"));
assertEquals(20.0, ((HashMap) values.get(0)).get("lon"));
assertEquals(20.0, ((HashMap) values.get(1)).get("lat"));
assertEquals(30.0, ((HashMap) values.get(1)).get("lon"));
}
// IP field
{
SearchRequest searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource().derivedField("result", "ip", new Script("emit(\"10.0.0.1\")")).fetchField("result")
);

SearchResponse searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
SearchHit searchHit = searchResponse.getHits().getAt(0);
List<Object> values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(1, values.size());
assertEquals("10.0.0.1", values.get(0));

// multi-valued
searchRequest = new SearchRequest("test").source(
SearchSourceBuilder.searchSource()
.derivedField("result", "ip", new Script("emit(\"10.0.0.1\"); emit(\"10.0.0.2\");"))
.fetchField("result")
);

searchResponse = execute(searchRequest, highLevelClient()::search, highLevelClient()::searchAsync);
searchHit = searchResponse.getHits().getAt(0);
values = searchHit.getFields().get("result").getValues();
assertNotNull(values);
assertEquals(2, values.size());
assertEquals("10.0.0.1", values.get(0));
assertEquals("10.0.0.2", values.get(1));

}

}

public void testSearchScroll() throws Exception {
for (int i = 0; i < 100; i++) {
XContentBuilder builder = jsonBuilder().startObject().field("field", i).endObject();