
[Bug] TheHive with Cassandra is extremely slow and has permanently high CPU usage #1563

Closed
crackytsi opened this issue Oct 1, 2020 · 4 comments

@crackytsi

Request Type

Bug

Work Environment

OS version (server): Debian
OS version (client): Seven
TheHive version / git hash: 4.0, latest commit e75112b
Package Type: From source
Browser type & version: Chrome

Problem Description

Using TheHive with around 8,000 cases is extremely slow. For example, changing the case page takes around 10 seconds, which is unacceptable from a user-experience standpoint.

It seems that even basic icons aren't loaded in time, so the ability to handle multiple HTTP requests may be limited somehow.

Strangely, the log file reports a very short handling time (took <10 ms), while the Chrome developer tools show it takes 10 seconds (!) to download the page, e.g. <host:http port>/api/v1/query?name=cases.count
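To narrow down where the 10 seconds go, curl's write-out timing variables can separate connection setup, time to first byte, and total transfer time. A sketch (host, port, and API key placeholder are assumptions; substitute your own instance):

```shell
# Hypothetical host/port; adjust to your setup.
THEHIVE="http://thehive.example.org:9000"
FMT='connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n'

# Time a simple endpoint that the access log also reports on.
curl -s -o /dev/null -w "$FMT" \
     -H "Authorization: Bearer $THEHIVE_API_KEY" \
     "$THEHIVE/api/status"
```

If `ttfb` is small (matching the <10 ms in the log) but `total` is large, the response body is trickling out slowly rather than the server being slow to start responding, which points away from query execution and toward streaming/connection handling.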

Steps to Reproduce

  1. Install TheHive
  2. Generate 8000 cases with different observables (around 10 per case)
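Step 2 can be scripted against the TheHive REST API. A minimal sketch of the payload construction (field names, endpoints, and the helper functions below are assumptions based on the v0 API; verify against your instance before running):

```python
import json
# `requests` would be used for the actual POSTs; it is omitted here so
# the payload-building logic stands alone.

def make_case(i: int) -> dict:
    """Build a synthetic case payload (hypothetical field names)."""
    return {
        "title": f"Load-test case {i}",
        "description": "Synthetic case for performance testing",
        "severity": 2,
        "tags": ["perf-test"],
    }

def make_observables(i: int, per_case: int = 10) -> list:
    """Roughly 10 observables per case, as in the reproduction steps."""
    return [
        {"dataType": "ip", "data": f"10.0.{i % 256}.{n}"}
        for n in range(per_case)
    ]

if __name__ == "__main__":
    # Actual submission (hypothetical endpoints) would look like:
    #   requests.post(f"{url}/api/case", json=make_case(i), headers=auth)
    #   requests.post(f"{url}/api/case/{case_id}/artifact", json=obs, headers=auth)
    print(json.dumps(make_case(1), indent=2))
```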
top - 14:12:00 up 113 days,  3:49,  3 users,  load average: 1.47, 1.47, 1.48
Tasks: 195 total,   1 running, 194 sleeping,   0 stopped,   0 zombie
%Cpu0  : 19.4 us,  7.4 sy,  0.0 ni, 72.2 id,  0.0 wa,  0.0 hi,  1.1 si,  0.0 st
%Cpu1  : 17.5 us,  8.2 sy,  0.0 ni, 72.9 id,  0.0 wa,  0.0 hi,  1.4 si,  0.0 st
%Cpu2  :  4.1 us,  3.4 sy,  0.0 ni, 92.4 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
%Cpu3  :  3.0 us,  4.0 sy,  0.0 ni, 93.0 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
%Cpu4  :  8.7 us,  2.8 sy,  0.0 ni, 88.5 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
%Cpu5  : 14.0 us,  5.2 sy,  0.0 ni, 80.8 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
%Cpu6  : 12.2 us,  7.3 sy,  0.0 ni, 79.5 id,  0.0 wa,  0.0 hi,  1.0 si,  0.0 st
%Cpu7  : 11.3 us,  7.4 sy,  0.0 ni, 79.2 id,  0.0 wa,  0.0 hi,  2.1 si,  0.0 st
KiB Mem:  66114116 total, 24979544 used, 41134572 free,   329048 buffers
KiB Swap:  3019772 total,        0 used,  3019772 free.  7226680 cached Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
29415 cassand+  20   0 11.090g 9.249g 284316 S  92.6 14.7 578:47.23 java
 4346 thehive   20   0 21.941g 2.917g  28892 S  68.4  4.6 240:53.68 java
 1245 root      20   0 1000124  40216  25176 S   1.0  0.1   1511:10 docker-containe
30501 cortex    20   0 22.022g 1.046g  23960 S   1.0  1.7   1054:04 java

From application.log:

2020-10-01 14:10:36,688 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-18 [0000057d|] 10.32.30.26 GET /api/status took 1ms and returned 200 284 bytes
2020-10-01 14:10:38,498 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-18 [0000057a|] 10.32.30.26 GET /api/stream/nAc1seHC8ceChcUCqIVh took 60021ms and returned 200 2 bytes
2020-10-01 14:11:20,834 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-18 [0000057f|] 10.32.30.26 GET /api/status took 4ms and returned 200 284 bytes
2020-10-01 14:11:31,828 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-34 [00000580|] 10.32.30.26 GET /api/status took 1ms and returned 200 284 bytes
2020-10-01 14:11:36,690 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-15 [00000581|] 10.32.30.26 GET /api/status took 1ms and returned 200 284 bytes
2020-10-01 14:11:38,578 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-5 [0000057e|] 10.32.30.26 GET /api/stream/nAc1seHC8ceChcUCqIVh took 60020ms and returned 200 2 bytes
2020-10-01 14:12:20,818 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-15 [00000583|] 10.32.30.26 GET /api/status took 8ms and returned 200 284 bytes
2020-10-01 14:12:31,829 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-19 [00000584|] 10.32.30.26 GET /api/status took 4ms and returned 200 284 bytes
2020-10-01 14:12:36,712 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-18 [00000585|] 10.32.30.26 GET /api/status took 6ms and returned 200 284 bytes
2020-10-01 14:12:38,648 [INFO] from org.thp.scalligraph.AccessLogFilter in application-akka.actor.default-dispatcher-19 [00000582|] 10.32.30.26 GET /api/stream/nAc1seHC8ceChcUCqIVh took 60018ms and returned 200 2 bytes

@crackytsi added the TheHive4 and bug labels on Oct 1, 2020
@ZeeMastermind commented Oct 2, 2020

I also observed this, with fewer than 100 cases. We were able to restart the machine to fix it, as the machine had been running for a week. I'll try to hunt down some better specs on what happened.

Edit: I do remember the process running at "460% CPU usage" or something ridiculous; I'm betting there's some memory leakage in the JVM.
Edit 2: We had 8 cores, 32 GB RAM, and 300 GB of hard disk.

@shortstack

struggling with slow performance as well. 8 cores, 32 GB.

< 10 users, usually 2-3 consistently. 4k cases.

thehive4 is dragging :( the same load on thehive3 was lightning fast.

@crackytsi (Author)

Will hopefully be fixed in 4.1 with Indexes :)

@rriclet (Contributor) commented Mar 19, 2021

This issue has been fixed with #1731, included in release 4.1.0.
Feel free to re-open this issue if you still notice slow performance.
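For context, TheHive 4.1 added a full-text index backend via JanusGraph, which is what resolves the slow listing queries. A rough sketch of the relevant application.conf section (backend choice and directory path are assumptions; consult the 4.1 configuration docs for your setup):

```hocon
db.janusgraph {
  index.search {
    // Local Lucene index; Elasticsearch is also supported as a backend.
    backend = lucene
    directory = /opt/thp/thehive/index
  }
}
```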

@rriclet rriclet closed this as completed Mar 19, 2021