
Essential task grandpa-voter failed. Shutting down service. Error: Input("Essential task failed.") #560

Closed
hackfisher opened this issue Oct 9, 2020 · 6 comments · Fixed by #567
@hackfisher (Contributor) opened this issue, attaching a log:

long_text_2020-10-09-10-40-56.txt

@wi1dcard (Member) commented Oct 9, 2020:

Hint: only validator nodes that author new blocks hit this issue. Other nodes, such as full nodes or validator nodes that don't author blocks, do not.

@hackfisher (Contributor, Author):

Related: paritytech/substrate#6727

@wi1dcard (Member) commented Oct 9, 2020:

Workarounds for "Too many open files"

TL;DR: increase the limit on the maximum number of open files.

If you run the node with Docker:

docker run --ulimit nofile=65535:65535 darwinianetwork/darwinia ...

If you run the node under systemd, edit your service unit file and add:

[Service]
LimitNOFILE=65535

Then run systemctl daemon-reload and restart the service so the new limit takes effect.

If you run the node from a shell, PM2, or any PAM-authenticated session, edit /etc/security/limits.conf and add:

root soft nofile 65535
root hard nofile 65535
* soft nofile 65535
* hard nofile 65535
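Note that limits.conf only applies to new PAM sessions, so log out and back in first. A minimal sketch to confirm the new limits took effect in your login shell (uses only the shell built-in ulimit):

```shell
#!/bin/sh
# Sketch: print the current open-files limits for this shell session.
# Run after logging back in to confirm the limits.conf change applied.
soft=$(ulimit -Sn)   # soft limit for open file descriptors
hard=$(ulimit -Hn)   # hard limit for open file descriptors
echo "soft=$soft hard=$hard"
```

If the printed soft limit is still low, the PAM session was likely started before the edit, or pam_limits is not enabled for that login path.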

Check that the limit has been increased

$ pidof darwinia # Get the PID of your node process
9290

$ prlimit --pid 9290

RESOURCE   DESCRIPTION                             SOFT       HARD  UNITS
...
NOFILE     max number of open files                65535      65535 files
...

The SOFT and HARD columns show that 65535 is now the limit for this process.
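prlimit (from util-linux) can also raise the limit of an already-running process, which avoids a restart. A sketch, reusing the example PID 9290 from above (substitute your node's actual PID):

```shell
# Sketch: raise the NOFILE limit on a running process in place.
# Requires util-linux prlimit and root (or CAP_SYS_RESOURCE).
sudo prlimit --pid 9290 --nofile=65535:65535

# Verify the change (prints only the NOFILE row):
prlimit --pid 9290 --nofile
```

This changes the limit only for that process; make the persistent change (systemd unit or limits.conf) as well, or the old limit returns on restart.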

Get the number of open files

If you're interested in how many file descriptors the process opened:

$ pidof darwinia # Get the PID of your node process
9290
$ ls -1 /proc/9290/fd | wc -l # Get the current number of open files of your node process
323

Here, 323 is the current number of open file descriptors.
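The two checks above can be combined into one small script that reports fd usage against the process's own soft limit (a sketch, Linux-only since it reads /proc; pass a PID as the first argument, defaulting to the current shell):

```shell
#!/bin/sh
# Sketch: compare a process's open-fd count to its soft "Max open files"
# limit, both read from /proc. Usage: ./fdcheck.sh [PID]
pid=${1:-$$}
# Count entries in the process's fd directory.
open=$(ls -1 "/proc/$pid/fd" | wc -l)
# Field 4 of the "Max open files" row is the soft limit.
limit=$(awk '/^Max open files/ {print $4}' "/proc/$pid/limits")
echo "pid=$pid open=$open soft_limit=$limit"
```

If `open` is close to `soft_limit`, the node is about to hit the same "Too many open files" error.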


@wuminzhe (Contributor):

Docker doesn't have this problem; the Docker image seems to set a large limit by default: 1048576.

@AurevoirXavier AurevoirXavier linked a pull request Oct 14, 2020 that will close this issue
@AurevoirXavier (Member):

Will be fixed in 0.7.2.

Reopen if needed.

@hackfisher hackfisher unpinned this issue Oct 21, 2020
boundless-forest added a commit that referenced this issue Aug 1, 2023
* Move `EthereumStorageSchema` to primitives

* Delete unused deps