
Load saved objects from filesystem #2310

Open
coderlol opened this issue Dec 15, 2014 · 32 comments
Labels
enhancement New value added to drive a business result Feature:Saved Objects Team:Core Core services & architecture: plugins, logging, config, saved objects, http, ES client, i18n, etc

Comments

@coderlol

In Kibana 1.3.x, I could drop my dashboard JavaScript files into the .../app/dashboards directory directly. How do I do that with Kibana 4? I have a number of auto-generated dashboards that I need to "upload" to Kibana. The preferred means of uploading is to drop the .js into the app/dashboards directory. Would that be possible with Kibana 4?

@rashidkpc
Contributor

Dashboards are stored exclusively in elasticsearch for the time being. You can take a look at the .kibana index, or in Settings -> Objects if you'd like to see the format.

@rashidkpc rashidkpc changed the title Kibana 4 Beta 2 - Location of dashboard scripts? Load dashboards from filesystem Dec 15, 2014
@coderlol
Author

It is possible with the current Kibana to load dashboards from the filesystem, from .../app/dashboards. We have quite a few dashboards developed using templates and JavaScript. What are we going to do with them in Kibana 4?

@ranji2612

"Dashboards are stored exclusively in elasticsearch for the time being." Is this intended to change in the future?
I'm currently creating individual searches and indices using scripts that add the data to Elasticsearch, thereby automatically making them available in Kibana.

@coderlol
Author

Can you share how you upload your queries straight into Elasticsearch? Would it be possible to upload Kibana 3 dashboards into Elasticsearch and have Kibana 4 make use of them?

@ranji2612

@coderlol To create an index pattern:

This includes all the fields, so it's big; you can customize it based on what you need and what you don't. Note that the value of "fields" is itself a JSON-encoded string, so the inner quotes must be escaped:

curl -XPUT 'http://localhost:9200/.kibana/index-pattern/_guid_-logstash-' -d '{"title":"guid-logstash-","timeFieldName":"timestamp","customFormats":"{}","fields":"[{\"type\":\"string\",\"indexed\":false,\"analyzed\":false,\"name\":\"_source\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"dsid\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"type\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"@Version\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"ecid\",\"count\":0},{\"type\":\"date\",\"indexed\":true,\"analyzed\":false,\"doc_values\":false,\"name\":\"timestamp\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":false,\"name\":\"_type\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"level\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":false,\"name\":\"_id\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"userId\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"path\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"tid\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"logMessage\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"application\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"module\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"host\",\"count\":0},{\"type\":\"string\",\"indexed\":false,\"analyzed\":false,\"name\":\"_index\",\"count\":4},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"messageType\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"message\",\"count\":0},{\"type\":\"date\",\"indexed\":true,\"analyzed\":false,\"doc_values\":false,\"name\":\"@timestamp\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"server\",\"count\":0},{\"type\":\"string\",\"indexed\":true,\"analyzed\":true,\"doc_values\":false,\"name\":\"messageId\",\"count\":0}]"}'

To create a search (again, searchSourceJSON is a string, so its inner quotes must be escaped):

curl -XPUT 'http://localhost:9200/.kibana/search/_guid_' -d '{"title":"guid","description":"Search panel for guid Logs","hits":0,"columns":["_type","messageType","application","logMessage","module"],"kibanaSavedObjectMeta":{"searchSourceJSON":"{\"query\":{\"query_string\":{\"query\":\"\"}},\"filter\":[],\"index\":\"guid-logstash-\"}"}}'

A similar way could be done for dashboards too..

@coderlol
Author

I appreciate that. Thanks... alas, my dashboards are JavaScript, not templates, so I'm not sure this will work in Kibana 4.

@coderlol
Author

@ranji2612, this works well. It's a bit of a pain, but it works. Thanks again.

@ranji2612

@coderlol My scenario was creating more searches and indexes with different names based on a fixed template. The JSON for the template can be obtained from the edit index page, or from editing searches based on existing ones. I just copy-pasted them 👍 Can you post some of your .js here if possible? I'd like to have a look.

@coderlol
Author

Here you go ...
{
  "title": "overview-all",
  "description": "",
  "hits": 0,
  "columns": ["log", "thread", "source", "method", "data", "path", "host"],
  "kibanaSavedObjectMeta": {
    "searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"\"}},\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"\":{}}},\"filter\":[],\"index\":\"logstash-*\"}"
  }
}

@coderlol
Author

@ranji2612, my workflow is as follows:

  1. Go to Kibana and create searches, visualizations, dashboards, etc.
  2. Go to Elasticsearch and extract the JSON documents.
  3. Use curl to re-upload the JSON documents to Elasticsearch (at a later time or for a different installation).
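
The round trip above can be sketched as two small shell helpers. This is a sketch only, assuming Elasticsearch at localhost:9200 with a Kibana 4-era .kibana index; the type/ID arguments are illustrative:

```shell
# Default to a local node; point ES elsewhere for a different installation.
ES="${ES:-http://localhost:9200}"

# Step 2: pull one saved object (search, visualization, dashboard) to a file.
export_saved_object() {
  # usage: export_saved_object <type> <id> <outfile>
  curl -s "$ES/.kibana/$1/$2/_source" > "$3"
}

# Step 3: push the file back, to the same or another installation.
import_saved_object() {
  # usage: import_saved_object <type> <id> <infile>
  curl -s -XPUT "$ES/.kibana/$1/$2" -d @"$3"
}
```

For example, `export_saved_object search overview-all overview-all.json` on the source cluster, then re-run the import side with ES pointed at the target cluster.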

@apogre

apogre commented Mar 16, 2015

The new URL format eliminates the need for templated and scripted dashboards.

@lgoldstein

I agree with @coderlol: what are we going to do with them in Kibana 4? Why is this issue labeled an "enhancement" when it is obviously a regression? It used to work and no longer works. Which also brings us to issue #3486: why is it closed? After all, it isn't fixed, and it too is a regression of a (very useful) feature that used to work and now no longer works.

P.S. if you are trying to make Kibana popular you are going about it the wrong way...

@davux

davux commented Apr 23, 2015

Agreed with @lgoldstein. This is not an enhancement request at all; it's a bug report, as it used to work and now it doesn't. If Kibana 4 is no longer marked as Beta, then it should provide at least the same functionality as Kibana 3.

@lambodar

Team,
Any idea when this feature will be released? FYI, I am currently using Kibana 4.1.0.

@bounddog

bounddog commented Sep 7, 2015

I'm looking for a solution too, but there seems to be no good one so far.
The visualizations were designed based on index patterns, and the current Kibana 4.1.1 only supports exporting/importing visualization and dashboard objects. What about the index patterns?
We have to back up the whole .kibana index and restore it on a new production environment.

@rashidkpc rashidkpc added v5.1.0 and removed v4.5.0 labels Nov 23, 2015
@simianhacker simianhacker added P3 and removed v5.1.0 labels Jan 5, 2016
@tbragin tbragin added Feature:Dashboard Dashboard related features :Sharing labels Nov 9, 2016
@f15a

f15a commented Dec 9, 2016

Hi everyone,
Do you know if it is possible to automatically load/refresh a shared dashboard in Kibana 5 at a regular interval (e.g. every hour)?
My remaining problem: how to push a dashboard loaded via a shared URL into the right folder from which Kibana 5 loads dashboards on Debian or another Linux.

Regards

@ppf2
Member

ppf2 commented Nov 22, 2017

+1 Given that we have started encouraging users to use this new API, we will want to document this (#11632), as well as #14872. If it's experimental, we can mark it as such.

@szechyjs

It would be great if there were at least a CLI interface to this API so that a JSON file could easily be imported. A common use case we had with Kibana 3 was building prepackaged dashboards into our system installs; this was as simple as making sure a file got placed in the correct directory, using tools like Docker or Omnibus. It was also easy to update these dashboards, because all we had to do was ship a new file.

@awh

awh commented Mar 20, 2018

> We have the ability to import/export dashboards, both through the UI and through an API. Is that sufficient for your use cases?

@stacey-gammon It's sufficient, but far from ideal for anyone wanting to do revision control and configuration management of their dashboards, index patterns etc. As convenient and clean as this decision is from your perspective as an implementer, anyone wanting to use PaaS automation and leverage orchestrator features such as Kubernetes ConfigMaps is going to have to kludge around it to avoid annoying one-off manual configuration steps in their flow.

Grafana has gone through a similar evolution, where for example it was only possible to configure a datasource through the UI; we ended up having to create a separate service that continually pokes the desired config into the API endpoints. Thankfully now it's possible to load dashboards and datasources as configuration from the filesystem (grafana/grafana#9504) and we can throw that away.

So yes, whilst it is of course possible for end-users with these requirements to individually cobble together scripts that do once-and-only-once initial configuration, watch the filesystem for changes, and push them into your API endpoints, I would respectfully submit that it would be better for you to do that yourselves and ship it with Kibana for everyone to use, so that we can all benefit from the same bugfixes and from the solution being kept in sync with upstream changes.

Ideally we would have the following items read from static configuration in the filesystem, watched for changes:

  • All advanced settings
  • Index patterns which can be defined with well-known names instead of random IDs
  • Visualisation & dashboard JSON bound to the index patterns above by name
  • Searches

At this point one can keep all of the above in git in a Kubernetes ConfigMap or equivalent and have it mounted into the filesystem of the Kibana Docker container by the orchestrator at the end of a CD pipeline. The UI stuff is great for playing around and exploring, but when it comes to production deployments strong versioning and repeatability are paramount - let us use the tools we already have to obtain them.

Thank you for a great product, and for listening!

@sandstrom

sandstrom commented May 25, 2018

@spalger @w33ble @ppisljar Would be great to hear your thoughts on this matter!

With docker, kubernetes and the widespread use of chef/puppet/ansible/salt being able to configure these things via the filesystem would be very useful! 😄

@ppisljar
Member

@awh Just to be sure: are we talking about loading dashboards/searches/... directly from the filesystem (instead of from the .kibana index in elasticsearch), and not just about adding a CLI interface to our saved objects API?

Also, you don't want the objects read from these static files to be indexed in elasticsearch? (That's what you could already achieve with our saved objects API and a simple custom script.)

I can see this could be very useful, and I don't see (atm) any reason why we shouldn't do this. @stacey-gammon @spalger?
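
The "simple custom script" mentioned here could look roughly like the sketch below. It assumes a 6.x-era saved objects endpoint (POST /api/saved_objects/<type>/<id>) and a directory of files named <type>-<id>.json, each holding the object's attributes body; the URL, header usage, and file layout are all assumptions, not a documented recipe:

```shell
KIBANA="${KIBANA:-http://localhost:5601}"

push_saved_objects() {
  # usage: push_saved_objects <dir>   — files named like dashboard-overview.json
  for f in "$1"/*.json; do
    base=$(basename "$f" .json)
    obj_type=${base%%-*}   # text before the first dash, e.g. "dashboard"
    obj_id=${base#*-}      # everything after the first dash, e.g. "overview"
    curl -s -XPOST "$KIBANA/api/saved_objects/$obj_type/$obj_id?overwrite=true" \
         -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
         -d "{\"attributes\": $(cat "$f")}"
  done
}
```

Wired into a provisioning tool, this is the kind of glue script users currently have to maintain themselves.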

@spalger
Contributor

spalger commented May 25, 2018

Having saved objects that are not stored in Elasticsearch means that some objects would be loaded from outside the SavedObject client, or that we would need to load objects from disk and Elasticsearch in the SavedObject client.

I don't like either of those options and would much rather Kibana offered the ability to automatically index dashboards that were found on disk. If they were required to specify an ID or something then Kibana could know if they already existed and not wipe out changes or anything. This would still probably conflict with #18473

cc: @kobelb
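
The "known ID so Kibana doesn't wipe out changes" idea can be approximated today with a pre-flight existence check. A minimal sketch, assuming a 4.x-style .kibana index on localhost (this is user-side glue, not what Kibana itself would ship):

```shell
# Create the object only if no document with that ID exists yet, so a
# re-run never overwrites changes a user made through the UI.
import_if_absent() {
  # usage: import_if_absent <type> <id> <file>
  status=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost:9200/.kibana/$1/$2")
  if [ "$status" = "404" ]; then
    curl -s -XPUT "http://localhost:9200/.kibana/$1/$2" -d @"$3"
  else
    echo "skipping $1/$2: already exists (HTTP $status)" >&2
  fi
}
```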

@sandstrom

sandstrom commented May 28, 2018

(my point of view may not reflect the needs of others who've expressed interest in the functionality discussed here)

@spalger For us, automatically adding index patterns (and dashboards) found on disk would work well. As a matter of philosophy we have immutable infrastructure (i.e. no long-running Chef or similar; nodes are only set up once).

Our current problem is that we need complicated polling scripts in bash that wait for Kibana nodes to boot up completely and then inject the needed index patterns. This is brittle (sometimes the Kibana node won't be fully booted in time for our bash script).

Auto-adding index-patterns from disk (based on a config-file) would totally solve our problem. Having an ID to avoid overrides sounds like a good idea.
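
The brittle wait-for-boot step can at least be made explicit by polling Kibana's status endpoint before injecting anything. A sketch; the URL, retry budget, and interval are assumptions to tune for your environment:

```shell
wait_for_kibana() {
  # usage: wait_for_kibana <base-url>   e.g. http://localhost:5601
  attempts=0
  until curl -sf "$1/api/status" >/dev/null; do
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 60 ]; then
      echo "Kibana at $1 did not come up in time" >&2
      return 1
    fi
    sleep 5
  done
}
```

A bootstrap script would then run `wait_for_kibana http://localhost:5601 && inject_index_patterns` instead of racing the startup.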

@epixa
Contributor

epixa commented Jun 16, 2018

I see this as essentially the same thing as import/export, and I would expect the format of these imports to be exactly the same. The only addition here is that Kibana would somehow discover exports from the local file system and import them automatically.

The open question to me is to what extent we should handle changes in those exports. @spalger mentioned one possible way by tracking things by id, but other people in this thread specifically mentioned the need to track changes.

@epixa
Contributor

epixa commented Jun 16, 2018

I'm re-categorizing this as platform since we'd implement this as a generic saved object thing rather than exclusive to dashboards.

@epixa epixa added Team:Core Core services & architecture: plugins, logging, config, saved objects, http, ES client, i18n, etc enhancement New value added to drive a business result and removed Feature:Dashboard Dashboard related features :Sharing release_note:enhancement labels Jun 16, 2018
@epixa epixa changed the title Load dashboards from filesystem Load saved objects from filesystem Jun 16, 2018
@TagadaPoe

I also am interested in this feature, which would allow me to define dashboards using a Kubernetes ConfigMap, just like I do with Grafana.

@Securitybits-io

This would be awesome for me as well, as I am figuring out how to import from the filesystem using Salt! It would be a lot easier to just file.managed a file onto the filesystem and have Elastic import it.

@turtlemonvh

@Securitybits-io we use Salt to manage Kibana, and the approach I mentioned above is what I designed to handle this problem: #2310 (comment)

@Securitybits-io

@turtlemonvh
Yeah, that would work, and in Salt you can put that curl in a cmd.run state file.

The problem is that your script seems to be stateless: it will run every time you apply a state file and re-import unless there is a collision.

There's also the use case where the dashboard is updated. Taking Grafana as an example: you drop a basic dashboard into /var/lib/grafana/dashboards, and it loads any dashboard identified as new or changed. Having the same functionality for ELK would be awesome.
