Configurable Index Pattern #69
Hi,
Would it be possible to make the index pattern configurable?
I store topbeat data with an index pattern TOKEN_YYYY-DD-MM, where TOKEN is a prefix for each client/customer/system. In my case the topbeat logs are shipped with another log shipper, which can insert any JSON logs into ES.
What needs to be changed to import the dashboards? Just change the title in dashboards/topbeat.json?
Thanks,
Stefan

Comments
Is your topbeat data shipped by another "shipper" that produces the exact same document structure?
Yes, just a different index pattern. I could use direct Elasticsearch inserts as well (if the index pattern were configurable in libbeat). But the index-pattern problem with the dashboards would remain. So I assume only this line would need to be changed by the load.sh script?
Wow, there is a lot of hard-coded stuff in libbeat and these dashboards. I see the index pattern is in every "search" with the prefixes topbeat, filebeat, etc., for example here: https://github.com/elastic/beats-dashboards/blob/master/dashboards/search/Proc-stats.json#L11
That would be great; I have always found the idea of loading a bunch of JSON dashboards and blobs into ES to be a bit scary.
Indeed, the index pattern is hardcoded everywhere in the dashboards. One option would be to replace "topbeat-" with "token-" in all files under beats-dashboards/dashboards: search/System-wide.json, search/Processes.json, search/Proc-stats.json, search/Filesystem-stats.json, and index-pattern/topbeat.json. As the index pattern is configurable in the Beats, it would be nice to add an option to load.sh that passes the index pattern name. So I am marking this issue as a feature request. Thanks for raising it!
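As a workaround until load.sh supports this, a minimal sketch of that substitution is below. It assumes you run it from a local checkout of beats-dashboards, that the hardcoded pattern is the literal string topbeat-*, that GNU sed is available, and that CUSTOM_PATTERN is a placeholder for whatever pattern your shipper writes to; the file list comes from the comment above.

```sh
#!/bin/sh
# Sketch: rewrite the hardcoded topbeat index pattern before running load.sh.
# CUSTOM_PATTERN is an assumed placeholder for your own index pattern.
CUSTOM_PATTERN="token-*"

FILES="dashboards/search/System-wide.json \
       dashboards/search/Processes.json \
       dashboards/search/Proc-stats.json \
       dashboards/search/Filesystem-stats.json \
       dashboards/index-pattern/topbeat.json"

for f in $FILES; do
  # Replace every occurrence of the hardcoded pattern in place (GNU sed).
  sed -i "s/topbeat-\*/${CUSTOM_PATTERN}/g" "$f"
done

# Sanity check: list any files that still reference the old pattern.
grep -rl 'topbeat-\*' dashboards/ || echo "All topbeat-* references replaced."
```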
Thanks for your quick reply.
You can also ship to Logstash, which can alter the index pattern...
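To illustrate that route, here is a minimal sketch of a Logstash pipeline that receives Beats events and writes them to a custom index; the port, host, index name, and file name are assumptions for illustration, not values from this thread.

```sh
# Sketch: generate and run a minimal Logstash pipeline that indexes Beats
# events under a custom pattern (all values below are assumptions).
cat > custom-index.conf <<'EOF'
input {
  beats {
    port => 5044                       # topbeat configured with the logstash output
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "token-%{+YYYY.MM.dd}"    # custom index pattern instead of topbeat-*
  }
}
EOF

# Run Logstash with this pipeline (the path to the binary may differ).
bin/logstash -f custom-index.conf
```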
Hi, |
@radoondas I would suggest splitting this commit into two parts: the first is the cleanup with variables, which would be best to get in right away. Then, I think, the change we should discuss starts around line 100. Can you open a separate PR for the first part, so the discussion gets more focused on the second part? It is probably also worth mentioning this idea here: #84. Have a look.
@ruflin you are right. I was planning that, but then I did just one commit. I'll split the changes into 2 different commits, which can then be discussed separately. The PR will then be a very simple request.
@radoondas I would even split it into two PRs. That way one can already be merged, even if the discussion of the second takes longer.
Hi, I used 929d684 for some tests and wonder why it is not merged. It has worked so far (aside from the fact that the dashboards did not use .raw fields for strings and the right mapping for topbeat was not in place, but I assume this was my fault). It would be helpful if the dashboard loader could put the right mapping for topbeat (and the other Beats) in place when the data index is specified, e.g. --data-index=INDEX-NAME. A small issue is the creation of index patterns: we already have ours in place, and I then removed the topbeat, packetbeat, etc. index patterns manually. I think if index-pattern creation runs, it should use the values from .beatconfig, like it is done for the searches.
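On the mapping point, a minimal sketch of installing the topbeat mapping as an index template that covers a custom data index is below; the Elasticsearch host, the template file name, and the custom pattern are assumptions, and the template's "template" field is rewritten so it matches the custom index names instead of topbeat-*.

```sh
#!/bin/sh
# Sketch: install the topbeat mapping as an index template for a custom data
# index (host, file name, and pattern below are assumptions).
ES_HOST="http://localhost:9200"
CUSTOM_PATTERN="token-*"

# Point the template at the custom pattern instead of topbeat-*.
sed "s/\"template\": *\"topbeat-\*\"/\"template\": \"${CUSTOM_PATTERN}\"/" \
    topbeat.template.json > custom.template.json

# Upload the template; new indices matching the pattern get the topbeat mapping.
curl -XPUT "${ES_HOST}/_template/custom-topbeat" \
     -d @custom.template.json
```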
Any news on this? I have seen that this was merged: https://github.com/elastic/beats/pull/2119/files, which closes elastic/beats#921.
@megastef Yes, elastic/beats#2119 adds support for configuring the entire index pattern in the Beat configuration file, not only the index base name. We also improved the way the Beats dashboards are imported in 5.0: each Beat package comes with its own dashboard import script. You can try these changes here. We would love to hear your feedback about it before this solution is released.