Merge branch 'main' into add-alibabacloud-inference
* main: (39 commits)
  Update README.asciidoc (elastic#111244)
  ESQL: INLINESTATS (elastic#109583)
  ESQL: Document a little of `DataType` (elastic#111250)
  Relax assertions in segment level field stats (elastic#111243)
  LogsDB data generator - support nested object field (elastic#111206)
  Validate `Authorization` header in Azure test fixture (elastic#111242)
  Fixing HistoryStoreTests.testPut() and testStoreWithHideSecrets() (elastic#111246)
  [ESQL] Remove Named Expcted Types map from testing infrastructure  (elastic#111213)
  Change visibility of createWriter to allow tests from a different package to override it (elastic#111234)
  [ES|QL] Remove EsqlDataTypes (elastic#111089)
  Mute org.elasticsearch.repositories.azure.AzureBlobContainerRetriesTests testReadNonexistentBlobThrowsNoSuchFileException elastic#111233
  Abstract codec lookup by name, to make CodecService extensible (elastic#111007)
  Add HTTPS support to `AzureHttpFixture` (elastic#111228)
  Unmuting tests related to free_context action being processed in ESSingleNodeTestCase (elastic#111224)
  Upgrade Azure SDK (elastic#111225)
  Collapse transport versions for 8.14.0 (elastic#111199)
  Make sure contender uses logs templates (elastic#111183)
  unmute HistogramPercentileAggregationTests.testBoxplotHistogram (elastic#111223)
  Refactor Quality Assurance test infrastructure (elastic#111195)
  Mute org.elasticsearch.xpack.restart.FullClusterRestartIT testDisableFieldNameField {cluster=UPGRADED} elastic#111222
  ...

# Conflicts:
#	server/src/main/java/org/elasticsearch/TransportVersions.java
weizijun committed Jul 25, 2024
2 parents 4818434 + 6621816 commit 73a5f44
Showing 341 changed files with 5,180 additions and 1,401 deletions.
2 changes: 1 addition & 1 deletion README.asciidoc
@@ -1,6 +1,6 @@
= Elasticsearch

Elasticsearch is a distributed search and analytics engine optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic's open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.
Elasticsearch is a distributed search and analytics engine, scalable data store and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic's open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.

Use cases enabled by Elasticsearch include:

@@ -13,6 +13,7 @@ import org.elasticsearch.gradle.OS
import org.elasticsearch.gradle.internal.test.AntFixture
import org.gradle.api.file.FileSystemOperations
import org.gradle.api.file.ProjectLayout
import org.gradle.api.provider.ProviderFactory
import org.gradle.api.tasks.Internal
import org.gradle.process.ExecOperations

@@ -24,14 +25,17 @@ abstract class AntFixtureStop extends LoggedExec implements FixtureStop {
AntFixture fixture

@Inject
AntFixtureStop(ProjectLayout projectLayout, ExecOperations execOperations, FileSystemOperations fileSystemOperations) {
super(projectLayout, execOperations, fileSystemOperations)
AntFixtureStop(ProjectLayout projectLayout,
ExecOperations execOperations,
FileSystemOperations fileSystemOperations,
ProviderFactory providerFactory) {
super(projectLayout, execOperations, fileSystemOperations, providerFactory)
}

void setFixture(AntFixture fixture) {
assert this.fixture == null
this.fixture = fixture;
final Object pid = "${ -> this.fixture.pid }"
final Object pid = "${-> this.fixture.pid}"
onlyIf("pidFile exists") { fixture.pidFile.exists() }
doFirst {
logger.info("Shutting down ${fixture.name} with pid ${pid}")
Expand Up @@ -48,7 +48,6 @@
import java.nio.file.Files;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.Arrays;
import java.util.List;
import java.util.Locale;
import java.util.Optional;
@@ -315,36 +314,6 @@ private Optional<File> findRuntimeJavaHome() {
return env == null ? Optional.empty() : Optional.of(new File(env));
}

@NotNull
private String resolveJavaHomeFromEnvVariable(String javaHomeEnvVar) {
Provider<String> javaHomeNames = providers.gradleProperty("org.gradle.java.installations.fromEnv");
// Provide a useful error if we're looking for a Java home version that we haven't told Gradle about yet
Arrays.stream(javaHomeNames.get().split(","))
.filter(s -> s.equals(javaHomeEnvVar))
.findFirst()
.orElseThrow(
() -> new GradleException(
"Environment variable '"
+ javaHomeEnvVar
+ "' is not registered with Gradle installation supplier. Ensure 'org.gradle.java.installations.fromEnv' is "
+ "updated in gradle.properties file."
)
);
String versionedJavaHome = System.getenv(javaHomeEnvVar);
if (versionedJavaHome == null) {
final String exceptionMessage = String.format(
Locale.ROOT,
"$%s must be set to build Elasticsearch. "
+ "Note that if the variable was just set you "
+ "might have to run `./gradlew --stop` for "
+ "it to be picked up. See https://github.com/elastic/elasticsearch/issues/31399 details.",
javaHomeEnvVar
);
throw new GradleException(exceptionMessage);
}
return versionedJavaHome;
}

@NotNull
private File resolveJavaHomeFromToolChainService(String version) {
Property<JavaLanguageVersion> value = objectFactory.property(JavaLanguageVersion.class).value(JavaLanguageVersion.of(version));
@@ -354,10 +323,6 @@ private File resolveJavaHomeFromToolChainService(String version) {
return javaLauncherProvider.get().getMetadata().getInstallationPath().getAsFile();
}

private static String getJavaHomeEnvVarName(String version) {
return "JAVA" + version + "_HOME";
}

public static String getResourceContents(String resourcePath) {
try (
BufferedReader reader = new BufferedReader(new InputStreamReader(GlobalBuildInfoPlugin.class.getResourceAsStream(resourcePath)))
34 changes: 32 additions & 2 deletions build-tools/src/main/java/org/elasticsearch/gradle/LoggedExec.java
@@ -17,6 +17,8 @@
import org.gradle.api.provider.ListProperty;
import org.gradle.api.provider.MapProperty;
import org.gradle.api.provider.Property;
import org.gradle.api.provider.Provider;
import org.gradle.api.provider.ProviderFactory;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.Internal;
import org.gradle.api.tasks.Optional;
@@ -92,17 +94,45 @@ public abstract class LoggedExec extends DefaultTask implements FileSystemOperat
private String output;

@Inject
public LoggedExec(ProjectLayout projectLayout, ExecOperations execOperations, FileSystemOperations fileSystemOperations) {
public LoggedExec(
ProjectLayout projectLayout,
ExecOperations execOperations,
FileSystemOperations fileSystemOperations,
ProviderFactory providerFactory
) {
this.projectLayout = projectLayout;
this.execOperations = execOperations;
this.fileSystemOperations = fileSystemOperations;
getWorkingDir().convention(projectLayout.getProjectDirectory().getAsFile());
// For now mimic default behaviour of Gradle Exec task here
getEnvironment().putAll(System.getenv());
setupDefaultEnvironment(providerFactory);
getCaptureOutput().convention(false);
getSpoolOutput().convention(false);
}

/**
* We explicitly configure the environment variables that are passed to the executed process.
* This is required to make sure that the build cache and the Gradle configuration cache are
* correctly configured and can be reused across different build invocations.
*/
private void setupDefaultEnvironment(ProviderFactory providerFactory) {
getEnvironment().putAll(providerFactory.environmentVariablesPrefixedBy("BUILDKITE"));
getEnvironment().putAll(providerFactory.environmentVariablesPrefixedBy("GRADLE_BUILD_CACHE"));
getEnvironment().putAll(providerFactory.environmentVariablesPrefixedBy("VAULT"));
Provider<String> javaToolchainHome = providerFactory.environmentVariable("JAVA_TOOLCHAIN_HOME");
if (javaToolchainHome.isPresent()) {
getEnvironment().put("JAVA_TOOLCHAIN_HOME", javaToolchainHome);
}
Provider<String> javaRuntimeHome = providerFactory.environmentVariable("RUNTIME_JAVA_HOME");
if (javaRuntimeHome.isPresent()) {
getEnvironment().put("RUNTIME_JAVA_HOME", javaRuntimeHome);
}
Provider<String> path = providerFactory.environmentVariable("PATH");
if (path.isPresent()) {
getEnvironment().put("PATH", path);
}
}

@TaskAction
public void run() {
boolean spoolOutput = getSpoolOutput().get();
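The environment allow-listing in `setupDefaultEnvironment` above — pass through variables matching a few prefixes, plus a handful of explicitly named variables — can be sketched in Python. This is an illustrative sketch of the filtering idea only (prefix and name lists copied from the diff; the `filtered_environment` helper and sample data are hypothetical, not part of the Gradle build):

```python
def filtered_environment(
    environ,
    prefixes=("BUILDKITE", "GRADLE_BUILD_CACHE", "VAULT"),
    names=("JAVA_TOOLCHAIN_HOME", "RUNTIME_JAVA_HOME", "PATH"),
):
    """Build a reduced environment for a child process.

    Only variables matching an allow-listed prefix, plus a few explicitly
    named variables, are passed through -- an explicit environment keeps
    cache keys stable across otherwise-different build invocations.
    """
    env = {k: v for k, v in environ.items() if k.startswith(prefixes)}
    for name in names:
        if name in environ:
            env[name] = environ[name]
    return env

sample = {"PATH": "/usr/bin", "BUILDKITE_JOB_ID": "42", "HOME": "/root", "SECRET": "x"}
print(filtered_environment(sample))  # HOME and SECRET are dropped
```

The design point the diff makes is the same: replacing `System.getenv()` wholesale with a curated subset makes the task's inputs deterministic, so Gradle's build cache and configuration cache can be reused.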
6 changes: 6 additions & 0 deletions docs/changelog/108360.yaml
@@ -0,0 +1,6 @@
pr: 108360
summary: "ESQL: Fix variable shadowing when pushing down past Project"
area: ES|QL
type: bug
issues:
- 108008
29 changes: 29 additions & 0 deletions docs/changelog/109583.yaml
@@ -0,0 +1,29 @@
pr: 109583
summary: "ESQL: INLINESTATS"
area: ES|QL
type: feature
issues:
- 107589
highlight:
title: "ESQL: INLINESTATS"
body: |-
This adds the `INLINESTATS` command to ESQL which performs a STATS and
then enriches the results into the output stream. So, this query:
[source,esql]
----
FROM test
| INLINESTATS m=MAX(a * b) BY b
| WHERE m == a * b
| SORT a DESC, b DESC
| LIMIT 3
----
Produces output like:
| a | b | m |
| --- | --- | ----- |
| 99 | 999 | 98901 |
| 99 | 998 | 98802 |
| 99 | 997 | 98703 |
notable: true
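The behaviour the changelog describes — compute an aggregate per group, then enrich every original row with its group's result instead of collapsing the stream — can be illustrated with a small Python sketch. The input data (`a` in 0..99, `b` in 0..999) is a hypothetical dataset chosen to reproduce the sample output; this models the semantics only, not the actual ES|QL implementation:

```python
from collections import defaultdict

# Hypothetical "FROM test" input rows.
rows = [{"a": a, "b": b} for a in range(100) for b in range(1000)]

# STATS m = MAX(a * b) BY b -- aggregate per group ...
max_by_b = defaultdict(lambda: float("-inf"))
for r in rows:
    max_by_b[r["b"]] = max(max_by_b[r["b"]], r["a"] * r["b"])

# ... but INLINESTATS then joins the result back onto every input row.
enriched = [{**r, "m": max_by_b[r["b"]]} for r in rows]

# WHERE m == a * b | SORT a DESC, b DESC | LIMIT 3
result = [r for r in enriched if r["m"] == r["a"] * r["b"]]
result.sort(key=lambda r: (r["a"], r["b"]), reverse=True)
print(result[:3])  # the three rows shown in the table above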
5 changes: 5 additions & 0 deletions docs/changelog/111118.yaml
@@ -0,0 +1,5 @@
pr: 111118
summary: "[ES|QL] Simplify patterns for subfields"
area: ES|QL
type: bug
issues: []
5 changes: 5 additions & 0 deletions docs/changelog/111123.yaml
@@ -0,0 +1,5 @@
pr: 111123
summary: Add Lucene segment-level fields stats
area: Mapping
type: enhancement
issues: []
5 changes: 5 additions & 0 deletions docs/changelog/111184.yaml
@@ -0,0 +1,5 @@
pr: 111184
summary: Fix Dissect with leading non-ascii characters
area: Ingest Node
type: bug
issues: []
6 changes: 6 additions & 0 deletions docs/changelog/111186.yaml
@@ -0,0 +1,6 @@
pr: 111186
summary: "ES|QL: reduce max expression depth to 400"
area: ES|QL
type: bug
issues:
- 109846
5 changes: 5 additions & 0 deletions docs/changelog/111225.yaml
@@ -0,0 +1,5 @@
pr: 111225
summary: Upgrade Azure SDK
area: Snapshot/Restore
type: upgrade
issues: []
8 changes: 8 additions & 0 deletions docs/reference/cluster/nodes-stats.asciidoc
@@ -808,6 +808,14 @@ This is not shown for the `shards` level, since mappings may be shared across th
`total_estimated_overhead_in_bytes`::
(integer) Estimated heap overhead, in bytes, of mappings on this node, which allows for 1kiB of heap for every mapped field.

`total_segments`::
(integer) Estimated number of Lucene segments on this node

`total_segment_fields`::
(integer) Estimated number of fields at the segment level on this node

`average_fields_per_segment`::
(integer) Estimated average number of fields per segment on this node
=======
`dense_vector`::
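As a minimal sketch of how the new averaged statistic relates to the two totals documented above — the docs report it as an integer, so plain integer division is assumed here (the exact rounding and the `average_fields_per_segment` helper name are assumptions, not Elasticsearch code):

```python
def average_fields_per_segment(total_segment_fields: int, total_segments: int) -> int:
    # Reported as an integer in the node-stats docs; guard against
    # nodes that currently hold no Lucene segments.
    if total_segments == 0:
        return 0
    return total_segment_fields // total_segments

print(average_fields_per_segment(5_000, 250))  # hypothetical totals -> 20
```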
6 changes: 6 additions & 0 deletions docs/reference/esql/esql-commands.asciidoc
@@ -37,6 +37,9 @@ image::images/esql/processing-command.svg[A processing command changing an input
* <<esql-enrich>>
* <<esql-eval>>
* <<esql-grok>>
ifeval::["{release-state}"=="unreleased"]
* experimental:[] <<esql-inlinestats-by>>
endif::[]
* <<esql-keep>>
* <<esql-limit>>
ifeval::["{release-state}"=="unreleased"]
@@ -59,6 +62,9 @@ include::processing-commands/drop.asciidoc[]
include::processing-commands/enrich.asciidoc[]
include::processing-commands/eval.asciidoc[]
include::processing-commands/grok.asciidoc[]
ifeval::["{release-state}"=="unreleased"]
include::processing-commands/inlinestats.asciidoc[]
endif::[]
include::processing-commands/keep.asciidoc[]
include::processing-commands/limit.asciidoc[]
ifeval::["{release-state}"=="unreleased"]
41 changes: 41 additions & 0 deletions docs/reference/esql/esql-rest.asciidoc
@@ -278,6 +278,47 @@ POST /_query
----
// TEST[setup:library]

The parameters can be named parameters or positional parameters.

Named parameters use question mark placeholders (`?`) followed by a string.

[source,console]
----
POST /_query
{
"query": """
FROM library
| EVAL year = DATE_EXTRACT("year", release_date)
| WHERE page_count > ?page_count AND author == ?author
| STATS count = COUNT(*) by year
| WHERE count > ?count
| LIMIT 5
""",
"params": [{"page_count" : 300}, {"author" : "Frank Herbert"}, {"count" : 0}]
}
----
// TEST[setup:library]

Positional parameters use question mark placeholders (`?`) followed by an
integer.

[source,console]
----
POST /_query
{
"query": """
FROM library
| EVAL year = DATE_EXTRACT("year", release_date)
| WHERE page_count > ?1 AND author == ?2
| STATS count = COUNT(*) by year
| WHERE count > ?3
| LIMIT 5
""",
"params": [300, "Frank Herbert", 0]
}
----
// TEST[setup:library]
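As a rough sketch, either request body above can be assembled programmatically before POSTing it to `/_query` (the payload shapes are taken from the examples; the HTTP call itself is omitted, and deriving the positional query via `replace` is purely illustrative):

```python
import json

query = """
FROM library
| EVAL year = DATE_EXTRACT("year", release_date)
| WHERE page_count > ?page_count AND author == ?author
| STATS count = COUNT(*) by year
| WHERE count > ?count
| LIMIT 5
"""

# Named parameters: one single-key object per parameter.
named_body = {
    "query": query,
    "params": [{"page_count": 300}, {"author": "Frank Herbert"}, {"count": 0}],
}

# Positional parameters: bare values, referenced as ?1, ?2, ?3 in the query.
positional_query = (
    query.replace("?page_count", "?1").replace("?author", "?2").replace("?count", "?3")
)
positional_body = {"query": positional_query, "params": [300, "Frank Herbert", 0]}

print(json.dumps(named_body, indent=2))
```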

[discrete]
[[esql-rest-async-query]]
==== Running an async {esql} query