"invalid memory alloc request size" from PostgreSQL with large query results #4918
Comments
How did you measure the size of the dataset? In any case, the difference can come from the serialization used (JSON, where each row is an object). Also, I'm not sure this error is about exceeding 1 GB; it may just be about not having enough memory to allocate.
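To get a rough feel for how much the rows-as-objects JSON encoding can inflate the data compared to a plain dump, a sketch like the following could be used (the column names and row count are made up for illustration):

```python
import json

# Hypothetical shape: 20 columns and 1 million rows of short text values.
columns = [f"column_{i}" for i in range(20)]
rows = 1_000_000

sample = {c: "some value" for c in columns}

# If each row is stored as a JSON object, the column names are repeated
# in every row; a plain dump only stores the values per row.
json_row_bytes = len(json.dumps(sample))
dump_row_bytes = len(",".join(sample.values()))

print(f"per-row JSON size : {json_row_bytes} B")
print(f"per-row dump size : {dump_row_bytes} B")
print(f"estimated JSON total: {json_row_bytes * rows / 1024**2:.0f} MB")
print(f"estimated dump total: {dump_row_bytes * rows / 1024**2:.0f} MB")
```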
Of course, it's about the total size of the dataset. Aside from the general need to better support large data sets, the issue described here is more about your system configuration than anything in the code.
I just dumped the table and took the file size. Of course, that's more to get a rough idea of how big the dataset is than an accurate value.
That's what I thought at first when I ran into this error, but Redash runs on a public cloud instance with a comfortable 14 GB of total memory, on Debian buster, with no memory or other resource limits.
So, I managed to capture the whole query sent to PostgreSQL. The INSERT is 606 MB. I then tried to replay the INSERT in multiple environments: PostgreSQL 9.6 or 11, alpine/standard image, Docker or bare metal, on other servers. The query fails in all scenarios with the same error. I think we hit a limit somewhere in PostgreSQL. I'm closing the issue since it's only a consequence of #78.
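For reference, a minimal sketch of how such a captured statement could be replayed outside Redash (file name and connection string are placeholders, assuming the captured INSERT was saved to a plain SQL file beforehand):

```python
import psycopg2

# Hypothetical file name and DSN; the captured ~606 MB INSERT statement
# is assumed to have been saved to disk beforehand.
with open("captured_insert.sql") as f:
    statement = f.read()

conn = psycopg2.connect("dbname=redash user=redash host=localhost")
try:
    with conn.cursor() as cur:
        # Replaying the oversized statement is expected to fail server-side
        # with "invalid memory alloc request size ...".
        cur.execute(statement)
    conn.commit()
finally:
    conn.close()
```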
Issue Summary
I'm not sure if it's directly related to #78, so I'm starting a new issue.
When a query returns too much data, we get an explicit PostgreSQL error when Redash tries to insert the query result into its query_results table.
Steps to Reproduce
A few seconds later, PostgreSQL returns the following in its log:
Much later (~1 minute), the UI reports "Error running query: failed communicating with server. Please check your Internet connection and try again."
Note that it's not only about the number of rows returned; querying a single column over all the rows works.
I didn't find any PostgreSQL setting to tweak to get past this error. After some investigation, the error comes from the palloc functions: https://doxygen.postgresql.org/palloc_8h.html#a7d104f43412595fb2d3ca12963084f23
The maximum memory request size (MaxAllocSize) is 1 GB - 1 byte: https://doxygen.postgresql.org/memutils_8h.html#a74a92b981e9b6aa591c5fbb24efd1dac
What I still don't understand is why Redash/PostgreSQL needs more than 1 GB of memory to insert only 260 MB of query results.
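One plausible explanation (an assumption on my side, not verified against the Redash code path): the whole result set is serialized into one JSON document, that document is embedded in a single INSERT into query_results, and the server has to hold the statement text plus copies of the large value in contiguous buffers, so the peak single allocation can be a multiple of the raw data size. A back-of-envelope check against MaxAllocSize:

```python
# MaxAllocSize in PostgreSQL's memutils.h is 0x3fffffff: one byte short
# of 1 GiB.  Any single palloc() request above it fails with
# "invalid memory alloc request size".
MAX_ALLOC_SIZE = 0x3FFFFFFF
assert MAX_ALLOC_SIZE == 2**30 - 1  # 1073741823 bytes

# Numbers from this issue: ~260 MB of raw data grows to a ~606 MB INSERT
# once serialized to JSON (column names repeated per row) and quoted.
statement_bytes = 606 * 1024**2

# Purely illustrative assumption: if the server ever needs the statement
# text plus one more expanded copy of the embedded value in a single
# contiguous buffer, the request crosses the ceiling.
peak_request = statement_bytes * 2
print(peak_request > MAX_ALLOC_SIZE)  # True
```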
Technical details: