When using a custom registry URL in Kibana for EPR, we are not seeing the latest System package in the Kibana Fleet UI as we expect. #91505
Comments
Pinging @elastic/fleet (Team:Fleet)
Wouldn't this be a symptom of #91502 if we are not actually using the custom registry URL? I'd like to verify the license type before diving into any other areas.
Maybe not, Jen, though it is a very fair question. I was perhaps overly brief: I had understood the 0.9.2 package was not in production package storage, so since we were seeing it, we have some confidence Kibana is picking up something from the new registry. But why it wasn't picking up the even newer package is not clear.
Good point. This is a little hard to debug now, since 0.9.3 was removed and new system package versions have been deployed since then (across all registry environments). Might it be worth a complete rerun to observe the package versions again?
We can do that. Let's let the package updates settle for a few days; we're getting another round of updates out, which could make debugging hard all over again.
@dikshachauhan-qasource I think we should have new Linux / Windows packages in staging now, so this can be tested again. We can use the 7.12-snapshot or 7.12 BC2 build to validate, whichever is more convenient.
Hi @EricDavisX, we have performed testing on this ticket and below are our observations: on the 7.12 snapshot Kibana cloud build we ran into agent-related issues, hence we switched to the 7.11.1 snapshot. However, we were facing issues with build deployments from the staging URL; all deployments were showing up as unhealthy. So finally we attempted to validate the custom registry URL in Kibana for EPR settings on a stack environment, and created a stack environment for the 7.11.1 snapshot. After accessing Kibana, the Windows package version available was 4.1. We will repost our observations on the 7.12 snapshot build once the agent-related issues are resolved. Please let us know if any further details are required. Thanks.
Those are the correct package versions I'd expect, so perhaps there was a transient problem with the way we remove packages from the storage repo and promote / build new images. I don't have any further recourse to investigate this; if we reproduce it again we can re-open this, or log a new, more specific issue. I hope / expect that most of our testing won't need the custom registry URL and a local self-managed stack, but it remains an option to test with if we ever need it. I'm closing this out; if anyone has concerns please raise them. Thank you for helping to confirm this.
I am splitting off a concern noted in: elastic/beats#23812
The 'snapshot' branch of the package-storage repo is in use via a custom registry URL, and yet the 0.9.2 System package is showing in the Kibana UI instead of 0.9.3 as we expect.
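For reference, the custom registry is pointed at via kibana.yml. A minimal sketch, assuming the `xpack.fleet.registryUrl` setting is what carries the custom EPR URL in this release line:

```yaml
# kibana.yml: point Fleet at the snapshot package registry instead of the
# default production EPR (setting name assumed; verify for your Kibana version)
xpack.fleet.registryUrl: "https://epr-snapshot.elastic.co"
```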
The 0.9.3 version is available in the actual registry API call:
https://epr-snapshot.elastic.co/search?package=system&all=true
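Each entry returned by that search call should look roughly like this (an illustrative excerpt, not a captured response; field names assumed from the EPR search API):

```yaml
# one entry per package version when all=true is passed (illustrative sketch)
- name: system
  version: 0.9.3
  path: /package/system/0.9.3
```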
When I look in the storage 'snapshot' branch...
https://github.com/elastic/package-storage/tree/snapshot/packages/system
This shows only 0.9.2 for me, but the change removing 0.9.3 (to promote it to 'staging' storage) was merged just around 8:30 AM today, so I'm not sure where our process is breaking down here... the snapshot repo should have
The snapshot manifest shows the 0.9.3 package is tied to 7.10, as it should be:
https://github.com/elastic/package-storage/blob/snapshot/packages/system/0.9.3/manifest.yml#L13
And I confirmed we rolled out the package storage cluster with 'snapshot' before testing on Feb 15th:
https://beats-ci.elastic.co/job/Ingest-manager/job/release-distribution/47/