Support filtering of info files? #48
Comments
Yes, you are correct. It's a little confusing, but there are actually two JSON formats: gcov json and fastcov json. Gcov json is the JSON that comes directly from gcov. Its coverage structure is array-based, meaning that performing searches/deletions on files/functions/lines/branches requires iterating over the array to find what you are looking for (or a binary search, assuming you sort the arrays first).
I wanted to avoid all that complexity, so I convert it as soon as possible to "fastcov json", which is very similar to gcov json except that it is dictionary-based instead of array-based. This makes the resulting Python much simpler: I can check whether a source/function/line/branch exists, or delete a source/function/line/branch, etc., without iterating over a list, since they are now essentially keys in a map.
So, long story short, filtering is currently performed on gcov json, because there is no point in converting/processing data the user doesn't even want in the final report. However, I agree with you that it would be nice to allow filtering on existing info files. This would not be hard to add. Basically I'll change this:
to this:
I will probably have some time this coming week to implement it.
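To make the array-based vs. dictionary-based distinction above concrete, here is a minimal, hypothetical sketch; the field names are invented for illustration and are not the actual gcov or fastcov JSON schemas:

```python
# Illustrative only: invented field names, not the real gcov/fastcov schemas.

# gcov-style JSON is array-based: finding or removing a file means scanning a list.
gcov_style = {"files": [{"file": "a.cpp", "lines": [1, 2, 5]},
                        {"file": "b.cpp", "lines": [3, 4]}]}
a_record = next(f for f in gcov_style["files"] if f["file"] == "a.cpp")

# fastcov-style JSON is dictionary-based: sources/functions/lines become map keys,
# so existence checks and deletions are direct lookups instead of list scans.
fastcov_style = {"sources": {"a.cpp": {"lines": {"1": 2, "2": 0}},
                             "b.cpp": {"lines": {"3": 1, "4": 1}}}}
if "b.cpp" in fastcov_style["sources"]:
    del fastcov_style["sources"]["b.cpp"]
```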
Currently there is no support for that. Hopefully, after I add filtering support to combine operations, it will work as you expect. You'd be able to do:
This would combine the three reports, remove coverage on files coming from /usr/include, and then write out the final report.
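As a rough illustration of that workflow, here is a hypothetical sketch: the report names are made up, and the exact flag spellings (-C for combining tracefiles, --exclude for substring filters, --lcov/-o for the output report) are assumed from this thread rather than verified, so check fastcov's help output.

```python
import subprocess

# Hypothetical combine-plus-filter invocation; flags and file names are assumed,
# not taken verbatim from this thread.
subprocess.run(
    ["./fastcov.py",
     "-C", "report1.info", "report2.info", "report3.info",  # combine the three reports
     "--exclude", "/usr/include",                            # drop system-header coverage
     "--lcov", "-o", "combined.info"],                       # write the final info report
    check=True,
)
```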
Makes sense. I didn't mean to suggest adding support for reading .gcov files; I was referring to the current invocation of gcov when you find a .gcno/.gcda.
Description: - Fix #48 - Filter options now apply during combine operations
I have created an initial implementation on the branch bpg/add_combine_filtering. I need to add a few more functional tests to cover these new scenarios before I open a PR, but it should be done soonish.
Not sure if this was intended to be tested yet, but I gave it a try. Do I use -f to specify files? That seems most obvious, but I had problems.
Oops, sorry. I only mentioned the branch name so you can keep an eye on the changes/progress if you are interested (feel free to test as well, but it might not be stable). It's intended to be used with
That should be a valid command. In this case, the issue is:
should be
I haven't worked out all the kinks yet; I'll open a PR when I think it is ready.
@wsnyder Can you confirm the latest fastcov.py on master resolves the issue? I've tested it. Run with, i.e.:
--include seems to work as expected.
I made patches, which I'll put in a pull request in a moment, that I humbly suggest bring the flow closer to what I expected. The remaining problem after applying that fix is that I have
Note that I asked for "*/verilog.c" to be excluded, but I then get an error on it. Trying to exclude src/obj_dbg/verilog.c still doesn't work.
Here are the three issues I see; hopefully all minor, as it seems close!
I realize I lost a filterFastcov call in there. It could go back into combineCoverageFiles, or maybe it is better in main? I'll leave it to your preference.
A few things - fastcov does not currently use regex for filtering, just substring matching. So this would likely work:
(Note that because it is substring matching, you need to provide enough of the path to prevent false positives; a short pattern could unintentionally match a similarly named file elsewhere.)
If I understand your PR correctly, you want fastcov
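As a small, hypothetical sketch of that substring-matching behavior (this is not fastcov's actual filtering code, and the paths are invented), the example below shows why a glob-style pattern such as */verilog.c never matches, while a plain substring does:

```python
# Invented paths and patterns, purely to illustrate substring (not glob/regex) matching.
paths = ["/usr/include/stdio.h", "src/obj_dbg/verilog.c", "src/verilog.h"]

def excluded(path, patterns):
    # A path is filtered out if any pattern occurs in it as a plain substring.
    return any(p in path for p in patterns)

print([p for p in paths if not excluded(p, ["*/verilog.c"])])  # "*" is literal: nothing is filtered
print([p for p in paths if not excluded(p, ["/verilog.c"])])   # plain substring: verilog.c is dropped
```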
I don't really use the combine operation feature, so I'm not a typical user, but if you think it is more intuitive that way, I can certainly make that change. I should also probably update the README so it is clearer what the fastcov pipeline is, and how
Okay, I've made a new branch, bpg/fastcov_flow. This should address everything in the PR. It passes the test fixture; however, I still need to update the README before I open a PR. Feel free to test.
Looks like it's working, and EXCLs are also properly handled, unlike with lcov. Much thanks.
P.S. I would suggest that, instead of throwing an exception when a source file isn't found, you just print a "normal" warning without a backtrace, as I think it's a better user experience; a backtrace suggests your code is broken, which it isn't.
Was it branch coverage that dropped, or line coverage? If the former, it could be because of a flag you are passing.
If it is line coverage, please determine whether there is any loss of correctness. fastcov does not (or rather, should not) alter the line coverage it gets from gcov (other than with EXCL comments). If you can demonstrate a loss-of-correctness case, I will definitely fix it (and, more importantly, update the test fixture to catch future regressions).
And yes, fastcov is very light on try/except; usually I just let the error bubble out if fastcov hits an exceptional situation. I can make fastcov print a warning if an EXCL source file isn't found instead of letting the error bubble out.
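For what that suggestion might look like, here is a hypothetical sketch (not fastcov's actual code, and the helper name is invented) of warning and continuing, rather than raising, when a source file referenced by the coverage data cannot be found:

```python
import sys

def read_source_lines(path):
    # Hypothetical helper: if the source file is missing, emit a plain warning and
    # skip exclusion-marker processing for it instead of raising with a backtrace.
    try:
        with open(path, encoding="utf-8") as f:
            return f.readlines()
    except FileNotFoundError:
        print(f"Warning: source file '{path}' not found; skipping EXCL processing.",
              file=sys.stderr)
        return []
```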
Fully working. I had a bug in my script that makes the coverage delta; all resolved.
Closing this issue. I created a new issue #53 to track the exclusion warnings enhancement. |
I have been using fastcov -X to make a master .info file, and I also use fastcov -C to join infos. Thanks!
I had then, in later stages, been using a mix of lcov and scripts to process that master info, but I am having several problems with lcov, and rather than rewriting my own filter, I would ideally like to use the filtering already in fastcov.
Basically I'm looking to read the info, process source exclusions, and write an info, something like
But as far as I can tell from peeking at the sources, there's no way to read an info except for -C, and -C doesn't do source processing. I tried replacing most of the guts of processGcdas with:
but this hack fails, and the error suggests that parseInfo doesn't produce quite the same internal format as what gcovWorker creates. (If that had worked, I was eventually going to make this conditional on a filename ending in .info: do the parseInfo, then strip that filename from the list of files for gcov processing.)
I would also be willing to use the json format instead of the info format if that helps, but I don't think that works either.
The naive way I was thinking this worked (until I looked at the sources) is that you would be able to feed in any number of gcov, info, or json inputs; they get gcov'ed (as appropriate), joined (any format), filtered (unless -X), and finally written as .info or .json.
Is there a way to do what I want, or perhaps you would be willing to add it? Thanks