Encode Kibana dashboards #466
Comments
Would this mean that package-storage dashboards look different from the ones in integrations? What's the reason to do that?
Let me explain, starting from the end of the flow. Kibana accepts objects (dashboards, visualizations, etc.) that are partially encoded. Ultimately, it is up to the Integrations team to choose between readability and consistency and decide which one fits better.
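As context for what "partially encoded" means, here is a minimal sketch. The attribute name `panelsJSON` is a real Kibana saved-object field, but the dashboard content and the `encode_dashboard` helper are invented for illustration:

```python
import copy
import json

# A minimal, invented dashboard saved object in "decoded" form:
# nested attributes such as panelsJSON are plain JSON structures,
# which makes diffs and reviews readable.
decoded = {
    "type": "dashboard",
    "attributes": {
        "title": "Sample dashboard",
        "panelsJSON": [
            {"panelIndex": "1", "gridData": {"x": 0, "y": 0, "w": 24, "h": 15}},
        ],
    },
}

def encode_dashboard(obj):
    """Return a copy with panelsJSON serialized into a string,
    which is the partially encoded form Kibana imports."""
    encoded = copy.deepcopy(obj)
    attrs = encoded["attributes"]
    if not isinstance(attrs.get("panelsJSON"), str):
        attrs["panelsJSON"] = json.dumps(attrs["panelsJSON"])
    return encoded

encoded = encode_dashboard(decoded)
```

The encoded form round-trips losslessly, which is why the choice is mostly about review ergonomics rather than information content.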
Refreshing the topic - let's encode dashboards everywhere (integrations, package-storage). Advantages of such an approach:
If there are no objections, I will prepare a change management (CM) plan for rolling out the changes.
++ on package-storage and registry. One thing to keep in mind for the integrations repo: reviews like this one, spotting wrong queries, will not be easily possible if the dashboards are encoded: https://github.com/elastic/integrations/pull/65/files/aff30e36512ce4682e8459e79d53de2642c8790d#r437436596 At least my brain cannot cope with it. Throwing out a "what if": what if the package-registry had a config option to serve both variants? Without the flag it assumes the content is encoded; with the flag it decodes it on the fly. It sounds a bit out of scope for the registry, but at the same time it would help us migrate. The other option is that the registry detects on its own whether the content is decoded or not, but for that it would have to check each file before serving it.
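The decode-on-the-fly idea could look roughly like this. A sketch only: the set of string-encoded attribute names below is an assumption (these are common Kibana saved-object fields, not a confirmed list), and a real registry would do this in Go:

```python
import json

# Assumed set of attributes that Kibana stores as encoded JSON strings.
ENCODED_FIELDS = ("panelsJSON", "optionsJSON", "uiStateJSON", "searchSourceJSON")

def decode_on_the_fly(raw: str) -> str:
    """Serve a stored (encoded) dashboard in decoded form by parsing
    string-valued attributes back into JSON structures."""
    obj = json.loads(raw)
    attrs = obj.get("attributes", {})
    for field in ENCODED_FIELDS:
        if isinstance(attrs.get(field), str):
            attrs[field] = json.loads(attrs[field])
    return json.dumps(obj, indent=4)

# Invented example: an encoded dashboard as stored on disk.
stored = '{"attributes": {"title": "t", "panelsJSON": "[{\\"panelIndex\\": \\"1\\"}]"}}'
served = decode_on_the_fly(stored)
```

The trade-off mentioned above is visible here: the registry must parse every file it serves instead of streaming static bytes.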
Agree, but the complexity of the existing structure is already high enough, so I'm not sure encoding significantly changes it.
This is an implementation detail to me; both options (with a flag or on the fly) are fine. The real question is: presuming we want to keep dashboards decoded in integrations, when in the development process should we encode them? While preparing a PR for the package-storage?
If we keep them decoded, I would assume it happens during the process that prepares the PR. To keep things moving: encode it everywhere now and then follow up with integrations (if we want).
CM:
LGTM. Really looking forward to 4 and 6.
I would like to add some thoughts. Just to confirm my understanding at the moment of the pieces here: we have the source files for packages in the

I think that ideally we should have the source files in the format in which they are most easily maintained, which in the case of dashboards is tricky. On one side, it is true that changes in decoded dashboards are much easier to review; we have experienced that in Beats. On the other side, Kibana works with partially encoded objects. If Kibana were going to be the main developer tool, I would say to go with encoded objects, but we are going to have

I am not sure that we can assume that there is not going to be any "build" step between the source files and the served packages. We are going to find reasons to need a step there. For example, if we sign packages at some moment (which is a common security practice in package managers), we are going to need to collect the list of checksums of the files included in a package; in this process we could also transform the files that need it.
I don't think we have to choose between readability and consistency. I think that source files should be readable. Consistency should be guaranteed in the sense that from the same source files we should be able to generate consistently equivalent packages to serve.
On
I appreciate your point of view, Jaime, even though I have an opposite one ;) The consistency between both places is much more valuable to me. The reason why I do not care much about having decoded dashboards is that we have never edited them manually; in 98% of cases we edit them using the Kibana designer. The remaining 2% covers small fixes, typos, etc. In general, the structure of a dashboard is a mystery (all the meta properties, doubled fields, etc.). Even if I edit a dashboard manually, I still have to import it into Kibana to verify it. I do not insist on this approach, I can live with decoded objects for sure :) I want to highlight the following outcomes of such a decision (integrations - decoded, package-storage - encoded):
If this approach sounds more reasonable and convenient to the team, I will modify the above CM.
In Beats we have experienced both worlds: dashboards were not decoded at first, and since some time ago we decode them (elastic/beats#7224). In my opinion this was an important improvement in dashboards maintenance, especially because it made it much easier, or even possible, to make small fixes and to review both small and not-so-small changes. It also makes the mysterious structure of the dashboards much more approachable :) Something similar happened with pipelines: originally we had all of them in JSON, and supporting YAML (elastic/beats#11209) was a big improvement for developing and reviewing some of the more complex ones. Elasticsearch only supports pipelines in JSON format, but we use YAML for development, and the tooling (filebeat setup in this case) takes care of the conversions. By the way, what takes care of converting pipelines from YAML to JSON in packages?
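The YAML-to-JSON conversion described for pipelines is conceptually tiny. A sketch assuming PyYAML is available; the pipeline content is invented for illustration:

```python
import json

import yaml  # PyYAML; pipelines are authored in YAML for readability

# Invented minimal ingest pipeline, in the YAML form used for development.
pipeline_yaml = """
description: Example pipeline (illustration only)
processors:
  - rename:
      field: message
      target_field: event.original
"""

# The serving/setup tooling converts to the JSON form Elasticsearch ingests.
pipeline = yaml.safe_load(pipeline_yaml)
pipeline_json = json.dumps(pipeline, indent=2)
```

Since the conversion is lossless and mechanical, keeping YAML as the source format costs the tooling very little.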
I'd like to remark that my point is not only about decoded vs encoded objects; it is also that having a build step doesn't look unexpected to me, and we are probably going to need one in any case at some moment. And given that, why make sacrifices in maintainability?
I will be surprised if we don't find more reasons to need a
But (hopefully) the integrations repository is not going to be the only source of packages in the future. If some day we have packages from more repositories, or we open the registry to external contributions, we will need a way to prepare the files to be served; this could be a task for
Thanks Jaime, that's a good justification. I wasn't aware of all the historical reasons.
I think they're both supported by Kibana.
At the moment, it looks so, but I'm not sure how it will evolve in the future ;) My point was to get as close to static files as possible, even by eliminating

I will modify the CM steps to keep dashboards decoded in the integrations repository.
@jsoriano ES supports JSON and YAML pipelines. It basically supports YAML for everything, just the Beats code didn't ;-)
Everything is merged and the CM is finished. Thanks everyone for the discussion. Resolving.
The `mage build` step for the package-storage project should encode Kibana dashboards. Files in this form will be stored in the package-storage. The package-storage will validate PRs to check if the format is consistent.
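The PR validation mentioned here could be as simple as checking that nested attributes are stored as strings. A hedged sketch: the attribute names are assumptions, and a real check would walk every dashboard file touched by the PR:

```python
import json
from pathlib import Path

# Assumed attributes that should be encoded (JSON strings) in package-storage.
EXPECTED_ENCODED = ("panelsJSON", "optionsJSON", "uiStateJSON")

def is_encoded(dashboard: dict) -> bool:
    """True if every present nested attribute is a string (encoded form)."""
    attrs = dashboard.get("attributes", {})
    return all(
        isinstance(attrs[field], str)
        for field in EXPECTED_ENCODED
        if field in attrs
    )

def validate_file(path: Path) -> bool:
    """Check a single dashboard file for format consistency."""
    return is_encoded(json.loads(path.read_text()))
```

A decoded dashboard (e.g. `{"attributes": {"panelsJSON": [...]}}`) fails this check, so inconsistent PRs can be rejected before merge.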