
[DOCS] Reformat remove_duplicates token filter #53608

Merged (2 commits) on Mar 16, 2020
Conversation

jrodewig
Contributor

Makes the following changes to the `remove_duplicates` token filter
docs:

* Rewrites description and adds Lucene link
* Adds detailed analyze example
* Adds custom analyzer example
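For context, the `remove_duplicates` filter drops tokens that are identical to another token in the same position. An analyze request exercising it might look like the following sketch (the `keyword_repeat` + `stemmer` chain here is illustrative and not necessarily the exact example the PR added):

```console
GET /_analyze
{
  "tokenizer": "whitespace",
  "filter": [
    "keyword_repeat",
    "stemmer",
    "remove_duplicates"
  ],
  "text": "jumping dog"
}
```

`keyword_repeat` emits each token twice at the same position (one copy marked as a keyword, one for stemming). Where stemming leaves a token unchanged, the two copies are identical, and `remove_duplicates` removes the redundant one.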
@jrodewig jrodewig added >docs General docs changes :Search Relevance/Analysis How text is split into tokens labels Mar 16, 2020
@jrodewig jrodewig requested a review from romseygeek March 16, 2020 14:50
@elasticmachine
Collaborator

Pinging @elastic/es-docs (>docs)

@elasticmachine
Collaborator

Pinging @elastic/es-search (:Search/Analysis)

romseygeek
Contributor


LGTM

@jrodewig
Contributor Author

Thanks @romseygeek.

@jrodewig jrodewig merged commit e8ed337 into elastic:master Mar 16, 2020
@jrodewig jrodewig deleted the docs__reformat-remove-dup-tokenfilter branch March 16, 2020 15:21
jrodewig added a commit that referenced this pull request Mar 16, 2020
jrodewig added a commit that referenced this pull request Mar 16, 2020
@jrodewig
Contributor Author

Backport commits

master e8ed337
7.x e1eebea
7.6 d0b8535

Labels
>docs General docs changes :Search Relevance/Analysis How text is split into tokens v7.6.2 v7.7.0 v8.0.0-alpha1
4 participants