
Memory and CPU usage is very high. OOM exception killed Trivy #1295

Closed
hideme4u opened this issue Oct 14, 2021 · 10 comments · Fixed by #1509
Labels
kind/bug Categorizes issue or PR as related to a bug.

Comments

@hideme4u

Description

I am using Trivy 0.20.0 for a filesystem scan of the entire filesystem.
trivy -v
Version: 0.20.0
Command executed: trivy fs /
Machine configuration: quad-core CPU.
Total filesystem size: 92 GB, used: 89 GB

After starting the scan, I saw the Trivy process consuming up to 3 GB of resident memory and CPU usage going beyond 300%, until the process was finally killed (OOM).

What did you expect to happen?

The scan should have completed, with the Trivy process keeping memory and CPU consumption within acceptable limits.

What happened instead?

The process consumes more and more resident memory, CPU usage stays very high throughout (at least 200%, with all cores fully consumed at peak), and finally the process gets killed.

trivy fs --timeout 180m /
2021-10-14T16:20:53.384+0530 INFO Need to update DB
2021-10-14T16:20:53.385+0530 INFO Downloading DB...
24.29 MiB / 24.29 MiB [-------------------------------------------------------------------------------------------------------------] 100.00% 5.55 MiB p/s 5s
fatal error: runtime: out of memory

runtime stack:
runtime.throw({0x18cd716, 0xa000000})
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/panic.go:1198 +0x71
runtime.sysMap(0xc186c00000, 0x429680, 0xc000095e90)
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/mem_linux.go:169 +0x96
runtime.(*mheap).grow(0x297e3c0, 0x4ef7)
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/mheap.go:1393 +0x225
runtime.(*mheap).allocSpan(0x297e3c0, 0x4ef7, 0x0, 0x1)
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/mheap.go:1179 +0x165
runtime.(*mheap).alloc.func1()
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/mheap.go:913 +0x69
runtime.systemstack()
/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/asm_amd64.s:383 +0x49


Output of trivy -v:

trivy -v
Version: 0.20.0


Additional details (base image name, container registry info...):

(screenshot attached)

@hideme4u hideme4u added the kind/bug Categorizes issue or PR as related to a bug. label Oct 14, 2021
@hideme4u
Author

@knqyf263 Are there any published performance metrics for Trivy's CPU and memory consumption that we can refer to? I see CPU and memory usage going very high for filesystem scans. Please share any suggestions or guidelines. Thanks!

@hideme4u
Author

@knqyf263 / @santhosh1729 / @masahiro331 Can anyone help with this bug? Thanks!

@afdesk
Contributor

afdesk commented Oct 18, 2021

Hi @hideme4u! Thanks for your feedback!
I'll try to help you with this issue.
Please give me some time.

@hideme4u
Author

hideme4u commented Oct 19, 2021

@afdesk Thanks a lot. To add some CPU profiling data:

(pprof) top20
Showing nodes accounting for 63.37s, 18.58% of 341.02s total
Dropped 774 nodes (cum <= 1.71s)
Showing top 20 nodes out of 141
flat flat% sum% cum cum%
0.13s 0.038% 0.038% 170.99s 50.14% runtime.systemstack
60.22s 17.66% 17.70% 138.95s 40.75% runtime.scanobject
0.05s 0.015% 17.71% 135.26s 39.66% io.ReadAll
0.01s 0.0029% 17.71% 108.64s 31.86% github.com/aquasecurity/fanal/analyzer.Analyzer.AnalyzeFile.func1
0 0% 17.71% 106.82s 31.32% github.com/aquasecurity/fanal/analyzer/language/java/jar.javaLibraryAnalyzer.Analyze
0 0% 17.71% 106.49s 31.23% github.com/aquasecurity/go-dep-parser/pkg/java/jar.Parse
0.08s 0.023% 17.74% 106.05s 31.10% github.com/aquasecurity/go-dep-parser/pkg/java/jar.parseArtifact
0 0% 17.90% 95.46s 27.99% io/ioutil.ReadAll (inline)
0.01s 0.0029% 17.90% 93.61s 27.45% golang.org/x/sync/errgroup.(*Group).Go.func1
0.01s 0.0029% 17.90% 93.59s 27.44% github.com/saracen/walker.(*walker).gowalk
0.18s 0.053% 17.95% 93.58s 27.44% github.com/saracen/walker.(*walker).readdir
0.11s 0.032% 17.99% 89.50s 26.24% github.com/saracen/walker.(*walker).walk
0 0% 17.99% 75.10s 22.02% github.com/saracen/walker.(*walker).walk.func1
0.09s 0.026% 18.01% 71.32s 20.91% runtime.growslice
1.66s 0.49% 18.50% 70.84s 20.77% runtime.mallocgc
0.15s 0.044% 18.54% 62.44s 18.31% github.com/aquasecurity/fanal/walker.Dir.Walk.func1
0.13s 0.038% 18.58% 52.70s 15.45% github.com/aquasecurity/fanal/artifact/local.Artifact.Inspect.func1
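Reading the profile: most of the cumulative time sits under io.ReadAll, reached from the JAR analyzer (jar.Parse / jar.parseArtifact), and much of the rest is allocation and GC work (runtime.mallocgc, runtime.growslice, runtime.scanobject). The sketch below is a minimal, hypothetical illustration of that buffering pattern, not the actual fanal code; it only shows why reading every archive fully into memory drives resident memory up when many large JAR files are walked concurrently.

```go
// Hypothetical illustration of the hot path the profile points at; this is
// not the fanal implementation, just the buffering pattern it suggests.
package main

import (
	"fmt"
	"io"
	"os"
)

// analyzeJAR mimics an analyzer that buffers the whole archive before parsing.
func analyzeJAR(path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	// io.ReadAll grows and copies its buffer repeatedly (runtime.growslice /
	// runtime.mallocgc in the profile) and keeps the entire file resident
	// while it is parsed; a 500 MB JAR costs ~500 MB of heap per in-flight
	// goroutine.
	data, err := io.ReadAll(f)
	if err != nil {
		return err
	}
	fmt.Printf("%s: buffered %d bytes before parsing\n", path, len(data))
	return nil
}

func main() {
	for _, path := range os.Args[1:] {
		if err := analyzeJAR(path); err != nil {
			fmt.Fprintln(os.Stderr, err)
		}
	}
}
```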

@hideme4u
Author

Hi @afdesk, any comment on the above issue?

@afdesk
Contributor

afdesk commented Oct 20, 2021

Hi @hideme4u! Thanks a lot! Your log is really useful.

@afdesk
Contributor

afdesk commented Oct 20, 2021

@hideme4u It seems that your instance has a lot of large JAR files.
We have an idea of how to fix this issue, and we're testing it.
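For context, a common way to bound memory for archives in Go is to hand archive/zip an io.ReaderAt plus the file size instead of a byte slice produced by io.ReadAll. The sketch below is only an assumption about the general direction of such a fix, not the actual patch:

```go
// Rough sketch of a bounded-memory approach (an assumption, not the actual
// fanal patch): let archive/zip read from the file directly instead of from
// a fully buffered byte slice.
package main

import (
	"archive/zip"
	"fmt"
	"os"
)

func listJAR(path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	info, err := f.Stat()
	if err != nil {
		return err
	}

	// zip.NewReader only loads the central directory up front; individual
	// entries can then be opened and streamed one at a time as needed.
	zr, err := zip.NewReader(f, info.Size())
	if err != nil {
		return err
	}
	for _, entry := range zr.File {
		fmt.Println(entry.Name)
	}
	return nil
}

func main() {
	for _, path := range os.Args[1:] {
		if err := listJAR(path); err != nil {
			fmt.Fprintln(os.Stderr, err)
		}
	}
}
```

With this layout, resident memory stays roughly constant regardless of the archive size, which is what the profile above suggests is needed.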

@hideme4u
Author

@afdesk Okay, thanks for the update! Waiting to see the change/fix.

@hyy0322

hyy0322 commented Nov 11, 2021

@afdesk hello? Any update?

@knqyf263
Collaborator

We're working on it here.
aquasecurity/fanal#314
