
Commit

Merge branch 'main' of https://github.com/elastic/elasticsearch into add-kql-query
afoucret committed Nov 8, 2024
2 parents 0992c3a + 35c6b60 commit 5ae733e
Showing 31 changed files with 713 additions and 90 deletions.
59 changes: 27 additions & 32 deletions README.asciidoc
@@ -4,7 +4,7 @@ Elasticsearch is a distributed search and analytics engine, scalable data store

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
@@ -17,7 +17,7 @@ Use cases enabled by Elasticsearch include:
To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], see https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started
@@ -27,20 +27,20 @@ https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.
https://github.com/elastic/start-local is the source of truth.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.
This setup is intended for local development and testing only.
@@ -93,20 +93,20 @@ Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/
From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
// NOTCONSOLE

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:
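
The example itself is collapsed in this view. As a rough sketch only (the index name and the `ES_LOCAL_PASSWORD` variable are assumptions based on the start-local setup above, not taken from this README), such a request might look like:

[source,sh]
----
# Illustrative sketch: assumes the local start-local setup, with the built-in
# "elastic" user and its password available as ES_LOCAL_PASSWORD.
curl -u "elastic:${ES_LOCAL_PASSWORD}" \
  -X PUT "http://localhost:9200/my-index"
----
// NOTCONSOLE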

@@ -149,19 +149,19 @@ print(client.info())

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.
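
As a brief, hedged illustration (not part of this README; the template and data stream names below are made up), a data stream is typically backed by an index template that enables `data_stream`, and each document carries an `@timestamp` field:

----
PUT _index_template/logs-myapp-template
{
  "index_patterns": ["logs-myapp-*"],
  "data_stream": {},
  "priority": 500
}

POST logs-myapp-default/_doc
{
  "@timestamp": "2024-11-08T12:00:00Z",
  "message": "example log line"
}
----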

To add a single document to an index, submit an HTTP POST request that targets the index.

----
POST /customer/_doc/1
@@ -171,19 +171,19 @@ POST /customer/_doc/1
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.
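
The repository's own `_bulk` example is collapsed in this view; as an illustrative sketch (the document values are made up), an NDJSON bulk request alternates action lines and document lines:

----
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica", "lastname": "Rambeau" }
{ "create": { } }
{ "firstname": "Carol", "lastname": "Danvers" }
----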

----
@@ -200,15 +200,15 @@ PUT customer/_bulk

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
"query" : {
"match" : { "firstname": "Jennifer" }
"match" : { "firstname": "Jennifer" }
}
}
----
@@ -223,9 +223,9 @@ data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.

@@ -254,11 +254,6 @@ To build a distribution for another platform, run the related command:
./gradlew :distribution:archives:windows-zip:assemble
----

To build distributions for all supported platforms, run:
----
./gradlew assemble
----

Distributions are output to `distribution/archives`.

To run the test suite, see xref:TESTING.asciidoc[TESTING].
@@ -281,7 +276,7 @@ The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo con
[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?
25 changes: 18 additions & 7 deletions build.gradle
@@ -13,14 +13,13 @@ import com.avast.gradle.dockercompose.tasks.ComposePull
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper

import org.elasticsearch.gradle.DistributionDownloadPlugin
import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.internal.BaseInternalPluginBuildPlugin
import org.elasticsearch.gradle.internal.ResolveAllDependencies
import org.elasticsearch.gradle.internal.info.BuildParams
import org.elasticsearch.gradle.util.GradleUtils
import org.gradle.plugins.ide.eclipse.model.AccessRule
import org.gradle.plugins.ide.eclipse.model.ProjectDependency
import org.elasticsearch.gradle.DistributionDownloadPlugin

import java.nio.file.Files

@@ -89,7 +88,7 @@ class ListExpansion {

// Filters out intermediate patch releases to reduce the load of CI testing
def filterIntermediatePatches = { List<Version> versions ->
versions.groupBy { "${it.major}.${it.minor}" }.values().collect { it.max() }
}

tasks.register("updateCIBwcVersions") {
@@ -101,7 +100,10 @@ tasks.register("updateCIBwcVersions") {
}
}

def writeBuildkitePipeline = { String outputFilePath,
                               String pipelineTemplatePath,
                               List<ListExpansion> listExpansions,
                               List<StepExpansion> stepExpansions = [] ->
def outputFile = file(outputFilePath)
def pipelineTemplate = file(pipelineTemplatePath)

@@ -132,7 +134,12 @@ tasks.register("updateCIBwcVersions") {
// Writes a Buildkite pipeline from a template, and replaces $BWC_STEPS with a list of steps, one for each version
// Useful when you need to configure more versions than are allowed in a matrix configuration
def expandBwcSteps = { String outputFilePath, String pipelineTemplatePath, String stepTemplatePath, List<Version> versions ->
writeBuildkitePipeline(
  outputFilePath,
  pipelineTemplatePath,
  [],
  [new StepExpansion(templatePath: stepTemplatePath, versions: versions, variable: "BWC_STEPS")]
)
}

doLast {
Expand All @@ -150,7 +157,11 @@ tasks.register("updateCIBwcVersions") {
new ListExpansion(versions: filterIntermediatePatches(BuildParams.bwcVersions.unreleasedIndexCompatible), variable: "BWC_LIST"),
],
[
new StepExpansion(
  templatePath: ".buildkite/pipelines/periodic.bwc.template.yml",
  versions: filterIntermediatePatches(BuildParams.bwcVersions.indexCompatible),
  variable: "BWC_STEPS"
),
]
)

@@ -302,7 +313,7 @@ allprojects {
if (project.path.startsWith(":x-pack:")) {
if (project.path.contains("security") || project.path.contains(":ml")) {
tasks.register('checkPart4') { dependsOn 'check' }
} else if (project.path == ":x-pack:plugin" || project.path.contains("ql") || project.path.contains("smoke-test")) {
tasks.register('checkPart3') { dependsOn 'check' }
} else if (project.path.contains("multi-node")) {
tasks.register('checkPart5') { dependsOn 'check' }
6 changes: 6 additions & 0 deletions docs/changelog/114484.yaml
@@ -0,0 +1,6 @@
pr: 114484
summary: Add `docvalue_fields` Support for `dense_vector` Fields
area: Search
type: enhancement
issues:
- 108470
5 changes: 5 additions & 0 deletions docs/changelog/116437.yaml
@@ -0,0 +1,5 @@
pr: 116437
summary: Ensure class resource stream is closed in `ResourceUtils`
area: Indices APIs
type: enhancement
issues: []
@@ -66,10 +66,11 @@ indices that were created from the auto-follow pattern.
On the local cluster:

. Enhance any roles used by local cluster users with the required
<<roles-remote-indices-priv,remote indices privileges>> for {ccr} and {ccs}.
<<roles-remote-indices-priv,remote indices privileges>> or
<<roles-remote-cluster-priv, remote cluster privileges>> for {ccr} and {ccs}.
Refer to <<remote-clusters-privileges-api-key>>. Note:

** You only need to assign additional `remote_indices` or `remote_cluster` privileges to existing
roles used for cross-cluster operations. You should be able to copy these
privileges from the original roles on the remote cluster, where they are defined
under the certification based security model.
@@ -197,7 +198,7 @@ authentication.
Resume any persistent tasks that you stopped earlier. Tasks should be restarted
by the same user or API key that created the task before the migration. Ensure
the roles of this user or API key have been updated with the required
`remote_indices` or `remote_cluster` privileges. For users, tasks capture the caller's credentials
when started and run in that user's security context. For API keys, restarting a
task will update the task with the updated API key.

@@ -246,7 +247,7 @@ If you need to roll back, follow these steps on the local cluster:
. Remove the remote cluster definition by setting the remote cluster settings to
`null`.

. Remove the `remote_indices` or `remote_cluster` privileges from any roles that were updated during
the migration.

. On each node, remove the `remote_cluster_client.ssl.*` settings from
@@ -399,7 +399,7 @@ This does not show up in any logs.

====== Resolution

. Check that the local user has the necessary `remote_indices` or `remote_cluster` privileges. Grant sufficient `remote_indices` or `remote_cluster` privileges if necessary.
. If permission is not an issue locally, ask the remote cluster administrator to
create and distribute a
<<security-api-create-cross-cluster-api-key,cross-cluster API key>>. Replace the
14 changes: 13 additions & 1 deletion docs/reference/rest-api/security/bulk-create-roles.asciidoc
@@ -75,7 +75,7 @@ that begin with `_` are reserved for system usage.
For more information, see
<<run-as-privilege>>.
`remote_indices`:: (list) A list of remote indices permissions entries.
+
--
NOTE: Remote indices are effective for <<remote-clusters-api-key,remote clusters configured with the API key based model>>.
@@ -94,6 +94,18 @@ have on the specified indices.
read access to. A document within the specified indices must match this query in
order for it to be accessible by the owners of the role.
`remote_cluster`:: (list) A list of remote cluster permissions entries.
+
--
NOTE: Remote cluster permissions are effective for <<remote-clusters-api-key,remote clusters configured with the API key based model>>.
They have no effect for remote clusters configured with the <<remote-clusters-cert,certificate based model>>.
--
`clusters` (required)::: (list) A list of cluster aliases to which the permissions
in this entry apply.
`privileges` (required)::: (list) The cluster level privileges that the owners of the role
have in the specified clusters.
For more information, see <<defining-roles>>.
====
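
As a hedged sketch of how a `remote_cluster` entry fits into a bulk role payload (the role name, cluster alias, and privilege below are illustrative, not taken from this page):

----
POST /_security/role
{
  "roles": {
    "remote-replication": {
      "cluster": [ "manage_ccr" ],
      "remote_cluster": [
        {
          "clusters": [ "my_remote_cluster" ],
          "privileges": [ "monitor_enrich" ]
        }
      ]
    }
  }
}
----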

11 changes: 11 additions & 0 deletions docs/reference/rest-api/security/create-roles.asciidoc
@@ -96,6 +96,17 @@ have on the specified indices.
read access to. A document within the specified indices must match this query in
order for it to be accessible by the owners of the role.

`remote_cluster`:: (list) A list of remote cluster permissions entries.
+
--
NOTE: Remote cluster permissions are effective for <<remote-clusters-api-key,remote clusters configured with the API key based model>>.
They have no effect for remote clusters configured with the <<remote-clusters-cert,certificate based model>>.
--
`clusters` (required)::: (list) A list of cluster aliases to which the permissions
in this entry apply.
`privileges` (required)::: (list) The cluster level privileges that the owners of the role
have in the specified clusters.

For more information, see <<defining-roles>>.
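
A minimal sketch of a single role created with a `remote_cluster` entry (names and privilege are illustrative, not taken from this page):

----
POST /_security/role/remote-cluster-role
{
  "remote_cluster": [
    {
      "clusters": [ "my_remote_cluster" ],
      "privileges": [ "monitor_enrich" ]
    }
  ]
}
----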

[[security-api-put-role-example]]
@@ -2,7 +2,8 @@
=== Configure roles and users

To use a remote cluster for {ccr} or {ccs}, you need to create user roles with
<<roles-remote-indices-priv,remote indices privileges>> on the local cluster.
<<roles-remote-indices-priv,remote indices privileges>> or
<<roles-remote-cluster-priv, remote cluster privileges>> on the local cluster.

You can manage users and roles from Stack Management in {kib} by selecting
*Security > Roles* from the side navigation. You can also use the
@@ -80,7 +81,7 @@ POST /_security/role/remote-search
"privileges": [
"read",
"read_cross_cluster",
"view_index_metadata"
"view_index_metadata"
]
}
]
15 changes: 15 additions & 0 deletions muted-tests.yml
@@ -273,6 +273,21 @@ tests:
- class: org.elasticsearch.xpack.test.rest.XPackRestIT
  method: test {p0=ml/jobs_crud/Test put job deprecated bucket span}
  issue: https://github.com/elastic/elasticsearch/issues/116419
- class: org.elasticsearch.xpack.test.rest.XPackRestIT
  method: test {p0=ml/explain_data_frame_analytics/Test both job id and body}
  issue: https://github.com/elastic/elasticsearch/issues/116433
- class: org.elasticsearch.smoketest.MlWithSecurityIT
  method: test {yaml=ml/inference_crud/Test force delete given model with alias referenced by pipeline}
  issue: https://github.com/elastic/elasticsearch/issues/116443
- class: org.elasticsearch.xpack.test.rest.XPackRestIT
  method: test {p0=esql/60_usage/Basic ESQL usage output (telemetry) non-snapshot version}
  issue: https://github.com/elastic/elasticsearch/issues/116448
- class: org.elasticsearch.xpack.downsample.ILMDownsampleDisruptionIT
  method: testILMDownsampleRollingRestart
  issue: https://github.com/elastic/elasticsearch/issues/114233
- class: org.elasticsearch.xpack.test.rest.XPackRestIT
  method: test {p0=ml/data_frame_analytics_crud/Test put config with unknown field in outlier detection analysis}
  issue: https://github.com/elastic/elasticsearch/issues/116458

# Examples:
#