VULNERABILITY.SEVERITY should be updated in database #2474
This is how DT knows if a vuln has an unassigned severity. We could always populate this field with 'Unassigned' so that it's no longer null, but it would not change the behavior. Users do not consume the database directly, only the API. If there are irregularities in the API, then that's something we should fix. But the database is not designed to be consumed directly. I don't want to add more logic to this, or additional fields to indicate a null value, as those would introduce many changes to logic across the backend and frontend that do not benefit the end user.
@stevespringett We stumbled across this issue when implementing the API. Any alternative ideas on how to solve this?
I feel like the logic of calculating the severity could be shifted to the point in time when the vulnerability record is updated in the database. If we only calculate it at runtime, it's impossible to query by severity, resulting in very inefficient fetching behavior. Whether or not a vuln has an unassigned severity is a check that can be done just as well at "persist-time". We'd have to make sure though that existing data is not negatively affected by this change, i.e. we need either a migration that populates the severity field for all existing vulns, or a task that corrects the existing data afterwards.
@nscuro is there a negative impact if the current field is simply set to a calculated value? Could the calculation change at some point? Would it be necessary to mark the value as calculated?
Looking at the code, it has not changed in over five years. However, the calculation is priority-based, where the priorities are:
The score -> severity mappings are clearly defined for both CVSS and OWASP RR, so those won't change. That being said, CVSSv4 is already available, and CycloneDX v1.5 supports it, too. Once DT supports it, the prioritization mentioned above may change such that CVSSv4 will be preferred over CVSSv3.x. But unless CVSSv4 support is added, we can't ingest it anyway. This means we will not have to re-compute all the severities in the portfolio; it will still suffice to apply the new priority when a vulnerability record with a CVSSv4 rating is updated. Outside of that, if we change the computation, then yes, I guess we would have to recompute all values. But even that would be a one-time thing that we could do either through upgrades, or via a task that corrects the data in the background. After all, we're talking about a finite number of vulnerability records (~200-400k, depending on how many sources are enabled). And again, I don't see this scenario happening, really.
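As a rough illustration of the priority-based derivation described above, here is a minimal sketch. The names (`Severity`, `derive`, `fromCvssV3`) and the exact priority order are assumptions for illustration, not Dependency-Track's actual `VulnerabilityUtil` API; the score thresholds follow the published CVSS qualitative rating scales.

```java
// Illustrative sketch only; not DT's real implementation.
enum Severity { CRITICAL, HIGH, MEDIUM, LOW, UNASSIGNED }

final class SeveritySketch {

    // CVSS v3 qualitative rating scale (0.1-3.9 LOW, 4.0-6.9 MEDIUM,
    // 7.0-8.9 HIGH, 9.0-10.0 CRITICAL).
    static Severity fromCvssV3(double score) {
        if (score >= 9.0) return Severity.CRITICAL;
        if (score >= 7.0) return Severity.HIGH;
        if (score >= 4.0) return Severity.MEDIUM;
        if (score > 0.0)  return Severity.LOW;
        return Severity.UNASSIGNED;
    }

    // CVSS v2 has no CRITICAL band: 7.0-10.0 maps to HIGH.
    static Severity fromCvssV2(double score) {
        if (score >= 7.0) return Severity.HIGH;
        if (score >= 4.0) return Severity.MEDIUM;
        if (score > 0.0)  return Severity.LOW;
        return Severity.UNASSIGNED;
    }

    // Assumed priority order: CVSSv3 first, then CVSSv2, then an
    // OWASP RR-derived severity, falling back to UNASSIGNED.
    static Severity derive(Double cvssV3, Double cvssV2, Severity owaspRr) {
        if (cvssV3 != null) return fromCvssV3(cvssV3);
        if (cvssV2 != null) return fromCvssV2(cvssV2);
        if (owaspRr != null) return owaspRr;
        return Severity.UNASSIGNED;
    }
}
```

Because the derivation depends only on the ratings attached to the vulnerability record, it can run identically at request time or at persist time; the latter makes the result queryable.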
I don't currently see any benefit in that. FWIW, I see the following cases when a vulnerability is created / updated:
If, for some reason, we still can't store the calculated severity in the DB, it's still possible to perform the same computation in SQL. It's pretty much a giant CASE expression.
Ok, thank you. In this case I agree: the calculation should be moved to the moment the vulnerability is either created or updated in the database, and the result stored in the existing field.
My team is currently working on an implementation for this :)
Signed-off-by: Ralf King <[email protected]>
Current Behavior
If the severity of a `Vulnerability` is NULL in the database, the API will calculate a severity before responding with a `Vulnerability` object (see `dependency-track/src/main/java/org/dependencytrack/util/VulnerabilityUtil.java`, lines 75 to 95 in ce2267f).
This creates a mismatch between the database and the API responses, because the severity value will still remain NULL in the database, even though a severity for this vulnerability is now known after the calculation.
Example of mismatch between database and API/Frontend
Database entry:
API response:
Frontend:
This mismatch makes it impossible to correctly filter vulnerabilities by severity in the database and respond via the API with the filtered list. The calculation cannot be performed by the database, so vulnerabilities that would normally match the queried severity are excluded from the response, because they do not have a severity stored in the database.
Proposed Behavior
The severity value of a `Vulnerability` shouldn't be NULL in the database. It should either be updated to the calculated value after the calculation or, probably even better, the severity should be calculated before a new vulnerability is created and inserted into the database together with the vulnerability, to minimize mismatches and make correct filtering by severity possible in the database.
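A minimal sketch of the proposed persist-time approach, assuming a simplified entity with only a CVSSv3 score (field and method names here are hypothetical, not Dependency-Track's actual entity or lifecycle hooks):

```java
// Hypothetical sketch: derive the severity before the row is written,
// so the SEVERITY column is never NULL.
final class VulnerabilityRow {
    Double cvssV3BaseScore; // null when no CVSSv3 rating exists
    String severity;        // persisted column

    // Invoked before every INSERT or UPDATE of the vulnerability record.
    void computeSeverityBeforePersist() {
        if (cvssV3BaseScore == null) {
            severity = "UNASSIGNED";   // explicit value instead of NULL
        } else if (cvssV3BaseScore >= 9.0) {
            severity = "CRITICAL";
        } else if (cvssV3BaseScore >= 7.0) {
            severity = "HIGH";
        } else if (cvssV3BaseScore >= 4.0) {
            severity = "MEDIUM";
        } else {
            severity = "LOW";
        }
    }
}
```

With the severity materialized like this, a plain `WHERE SEVERITY = ?` filter works in the database, and the API response always matches the stored row.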