
We should run Fluster through some CI #6

Open
dwlsalmeida opened this issue May 14, 2023 · 10 comments
Labels: enhancement (New feature or request)

Comments

@dwlsalmeida
Collaborator

Apparently, one of my minor changes broke all of the H.264 tests. It took a while before this was fixed by 922d678.

We should have some automation in place to run Fluster on every commit.

@Gnurou I do not know much about CI systems in general, but I can do the work if you give me some initial directions.

@dwlsalmeida
Collaborator Author

Making the latest test scores available through the project's front page would also be interesting. At the very least, it would show how robust the crate is.

@Gnurou
Collaborator

Gnurou commented May 15, 2023

Agreed, that would be great. There are a couple of obstacles before we can do that:

  1. Fluster requires downloading quite a bit of test data, which we want to avoid doing on every push. There may be a way to cache it, though; I think GitHub CI provides one (see the workflow sketch after this list).

  2. GitHub CI does not expose a VAAPI device, and we won't be able to run Fluster without one, as we need to decode valid frames. So we'll probably need to run the CI somewhere else anyway.
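For the caching part, here is a minimal sketch of what a GitHub Actions workflow could look like, assuming Fluster's default `resources` directory and picking JVT-AVC_V1 as an example suite; the cache key and paths are placeholders:

```yaml
name: fluster

on: [push, pull_request]

jobs:
  fluster:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Fetch Fluster itself; pinning a known-good revision would be wise.
      - name: Check out Fluster
        run: git clone https://github.com/fluendo/fluster.git
      # Keep the downloaded test vectors between runs.
      - name: Cache test vectors
        id: vectors
        uses: actions/cache@v4
        with:
          path: fluster/resources
          key: fluster-test-vectors-v1
      # Only hit the network when the cache missed.
      - name: Download test vectors
        if: steps.vectors.outputs.cache-hit != 'true'
        working-directory: fluster
        run: ./fluster.py download JVT-AVC_V1
```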

@ndufresne

About 1., in both GitHub and GitLab CI you would normally download everything static (like the test vectors) while creating a Docker image. This Docker image then serves as your cache.

There is nothing you can do about 2. without being able to self-host your runners. On Freedesktop GitLab we have some specialized runners to avoid the indirection, but we are also sending jobs into the LAVA labs. There is no requirement to have your project on GitLab to do so; perhaps simply get in touch with them if you want to share some of the infrastructure.

It would certainly be amazing to have an entirely software-driven VA driver for testing, one that fully passes conformance, but that also seems like a lot of effort.

@Gnurou Gnurou added the enhancement New feature or request label May 22, 2023
@Gnurou
Collaborator

Gnurou commented May 23, 2023

Looks like we will indeed need to self-host our own runners: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners

I'll try to take a look at that soon, it would be nice to have this in place.
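For reference, pointing a workflow at such a runner is just a matter of matching labels; `vaapi` here is a hypothetical label we would assign to machines with a VAAPI device:

```yaml
jobs:
  fluster:
    # Runs only on self-hosted machines carrying all of these labels.
    runs-on: [self-hosted, linux, vaapi]
```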

@padovan

padovan commented May 23, 2023

For MesaCI, we use Freedesktop GitLab backed by a runner that can use any Chromebook in the Collabora lab. I think we might be talking about having something similar here, but details would need to be discussed with the right people.

I'll follow up on this internally with the Collabora folks involved.

@dwlsalmeida
Collaborator Author

@Gnurou Hey, I've been discussing this internally. It would be a good idea to try and upstream your Fluster changes if this is to go forward.

@laura-nao

Adding a few more notes from our previous internal discussion below.

As mentioned above, MesaCI uses a similar workflow in GitLab to the one that would be required here: a custom runner submits test jobs to the devices in our LAVA lab, then retrieves and parses the results (in the MesaCI case, that's done both on open merge requests for pre-merge conformance and on the main trees for post-merge performance tests). In the cros-codecs case, GitHub Actions could be used in a similar way to automate the job submission and to parse the results. LAVA provides XML-RPC and REST APIs that can be used to submit tests and fetch the results.

We also have a lava-gitlab-runner that serves as a bridge between GitLab and LAVA by leveraging the LAVA REST API; though it only targets GitLab, it can serve as a good reference for a GitHub implementation.

Moreover, testing cros-codecs with fluster would require booting a rootfs with GStreamer and fluster installed on the DUTs. KernelCI already runs some tests that use Debian-based rootfs images with both installed; these are rebuilt automatically every week, and the archives could be used for the initial development of the cros-codecs tests.
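To give an idea of what such a test job could look like, here is a rough sketch following the LAVA v2 job definition format; the device type, storage URLs, boot method, and decoder name are all placeholders:

```yaml
device_type: some-chromebook        # hypothetical device type
job_name: cros-codecs fluster run
priority: medium
visibility: public
timeouts:
  job:
    minutes: 60

actions:
  # Deploy a kernel and a Debian rootfs with GStreamer and fluster preinstalled.
  - deploy:
      to: tftp
      kernel:
        url: https://storage.example.org/kernel/bzImage
        type: image
      nfsrootfs:
        url: https://storage.example.org/rootfs/debian-gstreamer-fluster.tar.xz
        compression: xz
  - boot:
      method: depthcharge           # must match the actual device type
      commands: nfs
      prompts:
        - 'login:'
  # Run fluster and let LAVA collect the pass/fail results.
  - test:
      timeout:
        minutes: 30
      definitions:
        - from: inline
          name: fluster
          path: inline/fluster.yaml
          repository:
            metadata:
              format: Lava-Test Test Definition 1.0
              name: fluster
            run:
              steps:
                - cd /opt/fluster
                - ./fluster.py run -ts JVT-AVC_V1 -d <cros-codecs-decoder>
```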

As we have all these examples to get started from, roughly a week's worth of work could give us a very basic workflow in GitHub where we submit the job to LAVA and get the results back (using pre-built binaries from the KernelCI storage). This basic CI could serve as a proof of concept, but eventually some custom solutions for cros-codecs would be needed, such as (see the sketch after this list):

  • Set up automated builds of the test artifacts and find dedicated storage for them (including static artifacts such as the kernel image / dtb). The scripts that build the rootfs images in KernelCI could be reused for this purpose and customized to add the required dependencies (and cros-codecs itself).
  • Parse the LAVA results and handle job failures if needed (e.g. retry jobs when required)
  • Notify the developers about the test results
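On the GitHub side, a driver job could look roughly like the sketch below, using lavacli (a CLI wrapper around the LAVA APIs mentioned above); the server URI, secret name, job file path, and the crude grep-based failure check are all assumptions:

```yaml
  submit-to-lava:
    runs-on: ubuntu-latest
    env:
      LAVA_TOKEN: ${{ secrets.LAVA_TOKEN }}
    steps:
      - uses: actions/checkout@v4
      - name: Install lavacli
        run: pip install lavacli
      - name: Submit the fluster job and wait for the verdict
        run: |
          lavacli identities add --uri https://lava.example.org/RPC2 \
            --username ci --token "$LAVA_TOKEN" default
          JOB_ID=$(lavacli jobs submit ci/lava-fluster-job.yaml)
          lavacli jobs wait "$JOB_ID"
          # Crude result check; a real setup would query the results API.
          if lavacli results "$JOB_ID" | grep -q fail; then
            echo "fluster reported failures"
            exit 1
          fi
```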

@dwlsalmeida
Collaborator Author

Thanks @laura-nao !!

@padovan I wonder if there's anyone at Collabora who would be interested in working on this? It's a detour from my current skill set, and meanwhile we really must ensure we are not regressing any tests as we develop cros-codecs further.

@padovan

padovan commented Jul 4, 2023

Yes. This is on my radar. Someone from Collabora will look at the CI work here soon.

@dwlsalmeida
Collaborator Author

#48
