Search doesn't have to be hard. Let the dog do it.
"Bloodhound makes Elasticsearch almost tolerable!" - Almost-gruntled user
"ES is a nightmare but Bloodhound at least makes it tolerable." - Same user, later opinion.
As of version 0.13.0.0, Bloodhound has two separate module trees for Elasticsearch versions 1 and 5. Import the module that is appropriate for your use case. If you would like to add support for another major version, open a ticket expressing your intent and follow the pattern used for the existing versions. We weighed the idea of sharing code between versions, but it got too messy, especially given the instability of the Elasticsearch API. We switched to a model that lets the people responsible for a particular protocol version maintain that version without conflicting with the others.
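For example, something like the following, assuming the `V1`/`V5` module names shipped with this version of the package:

```haskell
-- Pick the module tree that matches the Elasticsearch version you target:
import Database.V5.Bloodhound    -- for Elasticsearch 5.x
-- import Database.V1.Bloodhound -- for Elasticsearch 1.x
```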
See our TravisCI for a listing of the Elasticsearch versions we test against.
Bloodhound is stable for production use. I will strive to avoid breaking API compatibility from here forward, but dramatic features such as a type-safe, fully integrated mapping API may require breaking changes in the future.
The TravisCI tests are run using Stack. You should use Stack instead of cabal to build and test Bloodhound to avoid compatibility problems. You will also need an Elasticsearch instance running at `localhost:9200` in order to execute some of the tests. See the "Version compatibility" section above for a list of Elasticsearch versions that are officially validated against in TravisCI.
Steps to run the tests locally:

- Dig through the [past releases](https://www.elastic.co/downloads/past-releases) section of the Elasticsearch download page and install the desired Elasticsearch versions.
- Install [Stack](http://docs.haskellstack.org/en/stable/README.html#how-to-install).
- In your local Bloodhound directory, run `stack setup && stack build`.
- Start the desired version of Elasticsearch at `localhost:9200`, which should be the default.
- Run `stack test` in your local Bloodhound directory (see the consolidated sketch after this list).
- The unit tests will pass if you re-execute `stack test`. If you want to start with a clean slate, stop your Elasticsearch instance, delete the `data/` folder in the Elasticsearch installation, restart Elasticsearch, and re-run `stack test`.
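Put together, a typical local run looks roughly like this (the Elasticsearch install path below is illustrative, not prescribed):

```sh
# Start a supported Elasticsearch version on the default localhost:9200
./elasticsearch-5.0.2/bin/elasticsearch -d

# Build Bloodhound and run the test suite against the local instance
stack setup && stack build
stack test
```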
http://hackage.haskell.org/package/bloodhound
It doesn't use Bloodhound, but if you need an introduction to or an overview of Elasticsearch and how to use it, you can use this screencast.
See the examples directory for example code.
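To give a flavor of the API, here is a minimal sketch against the `Database.V5.Bloodhound` tree. It assumes an Elasticsearch 5.x instance at `localhost:9200`; the `Tweet` type, index name, and mapping name are invented for illustration.

```haskell
{-# LANGUAGE DeriveGeneric     #-}
{-# LANGUAGE OverloadedStrings #-}
module Main where

import           Control.Monad.IO.Class (liftIO)
import           Data.Aeson             (FromJSON, ToJSON)
import           Data.Text              (Text)
import           Database.V5.Bloodhound
import           GHC.Generics           (Generic)
import           Network.HTTP.Client    (defaultManagerSettings, responseBody)

-- A document type invented for this sketch.
data Tweet = Tweet { user :: Text, message :: Text }
  deriving (Show, Generic)

instance ToJSON Tweet
instance FromJSON Tweet

main :: IO ()
main = withBH defaultManagerSettings (Server "http://localhost:9200") $ do
  -- Index and mapping names are invented for illustration.
  let idx     = IndexName "tweets"
      mapping = MappingName "tweet"
  _ <- createIndex defaultIndexSettings idx
  _ <- indexDocument idx mapping defaultIndexDocumentSettings
         (Tweet "bitemyapp" "hello from bloodhound") (DocId "1")
  _ <- refreshIndex idx  -- make the freshly indexed document visible to search
  -- Search for the document we just indexed.
  let query = TermQuery (Term "user" "bitemyapp") Nothing
  reply <- searchByIndex idx (mkSearch (Just query) Nothing)
  liftIO (print (responseBody reply))
```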
- Chris Allen
- Liam Atkinson
- Christopher Guiney
- Curtis Carter
- Michael Xavier
- Bob Long
- Maximilian Tagher
- Anna Kopp
- Matvey B. Aksenov
- Jan-Philip Loos
Beginning here: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-span-first-query.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-function-score-query.html
Might require TCP support.
Pretend to be a transport client?
Might require making a Lucene index on disk with the appropriate format.
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-geo-shape-query.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-geo-shape-filter.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-geohash-cell-filter.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-has-child-filter.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-has-parent-filter.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-indices-filter.html
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-filter.html
The Seminearring instance, if deeply nested, can produce redundant nested structure. Depending on how this affects Elasticsearch performance, reducing this structure might be valuable.
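As a rough illustration of the nesting (a sketch using the V1 `Filter` instance; the field names are invented, and it assumes that combining filters via `<&&>` builds an `AndFilter` at each step):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Database.V1.Bloodhound

userF, langF :: Filter
userF = TermFilter (Term "user" "bitemyapp") False
langF = TermFilter (Term "lang" "haskell") False

-- <&&> associates to the left, so chaining three filters yields
-- AndFilter [AndFilter [userF, langF] _, IdentityFilter] _ rather than one
-- flat AndFilter over all three -- the redundant nesting described above.
nested :: Filter
nested = userF <&&> langF <&&> IdentityFilter

main :: IO ()
main = print nested
```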
Check for n > 1 occurrences in DFS:
http://hackage.haskell.org/package/stable-maps-0.0.5/docs/System-Mem-StableName-Dynamic.html
http://hackage.haskell.org/package/stable-maps-0.0.5/docs/System-Mem-StableName-Dynamic-Map.html
Photo from HA! Designs: https://www.flickr.com/photos/hadesigns/