'Maximum call stack size exceeded' on large schema #172
Comments
Is that in neo4j-graphql-js? Or the server plugin?
The error occurs at the makeAugmentedSchema({typeDefs}) function.
@wimklerkx Would it be possible for you to share your schema SDL so we can reproduce the error?
I had the same problem; I solved it by disabling the auto-generated queries and mutations:
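For reference, a minimal sketch of what disabling the auto-generated queries and mutations might look like. This uses the `config` option of `makeAugmentedSchema` from neo4j-graphql-js; the type definitions here are placeholders, not the schema from this issue:

```javascript
// Sketch: disable auto-generated Query and Mutation fields so the
// augmentation step skips most of its type generation.
const { makeAugmentedSchema } = require('neo4j-graphql-js');

// Placeholder SDL; substitute your own type definitions.
const typeDefs = /* GraphQL */ `
  type Person {
    name: String
  }
`;

const schema = makeAugmentedSchema({
  typeDefs,
  config: {
    query: false,    // do not auto-generate Query fields
    mutation: false, // do not auto-generate Mutation fields
  },
});
```

With auto-generation off, any top-level queries and mutations you still want must be declared and resolved by hand, as the next comments discuss.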
@rdimicheleb That would defeat the purpose of the makeAugmentedSchema function.
@wimklerkx You are right, you need to write your own mutations, but I can still query the graph without writing every single query in the resolver. I just need to define the principal queries in the resolver using the neo4jgraphql function, for example:
resolver:
schema:
And I don't need to write any custom resolver for querying employees or persons. I know this issue should be fixed, but the above allowed me to continue working without crashing the server. I hope you find a solution soon.
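The resolver and schema snippets from this comment did not survive the page capture. A hedged reconstruction of the pattern being described follows: hand-written top-level resolvers that delegate to `neo4jgraphql`, with auto-generation disabled. The `Person`/`Employee` types and field names are assumptions standing in for the lost originals:

```javascript
const { makeAugmentedSchema, neo4jgraphql } = require('neo4j-graphql-js');

// Placeholder SDL illustrating the pattern; the original types from
// the comment were not preserved.
const typeDefs = /* GraphQL */ `
  type Person {
    name: String
  }
  type Employee {
    name: String
  }
  type Query {
    persons: [Person]
    employees: [Employee]
  }
`;

// Each top-level query simply delegates to neo4jgraphql, which
// translates the incoming GraphQL query into a Cypher statement,
// so no per-field resolvers are needed below the root.
const resolvers = {
  Query: {
    persons(object, params, context, resolveInfo) {
      return neo4jgraphql(object, params, context, resolveInfo);
    },
    employees(object, params, context, resolveInfo) {
      return neo4jgraphql(object, params, context, resolveInfo);
    },
  },
};

const schema = makeAugmentedSchema({
  typeDefs,
  resolvers,
  config: { query: false, mutation: false },
});
```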
@rdimicheleb Thanks for the elaboration, will do :)
@johnymontana Here is the large schema causing the problem |
Hey everyone, this problem was caused when I introduced the use of lodash/cloneDeep while refactoring some of the augmentation code a while back. It was having problems when cloning a type in a large schema. Sorry I didn't catch it before. Thanks for sharing your schema, @wimklerkx. As of #188, if you're using a larger schema, you should again be able to run the augmentation process on it. If you run into any further problems, feel free to ping me here.
The fix for this should be available now in v2.3.0 - please test and let us know if it works.
Yes, the call stack size error is now gone |
My schema.graphql is rather large, about 2500 lines without comments.
Until version 2.0.1 there were no problems with this schema,
but after updating to v2.1.1 the following error occurs:
RangeError: Maximum call stack size exceeded
Reducing the schema to about 250 lines removes the error.
Increasing the Node stack size does not help;
when increasing the stack size to 8500 (`--stack-size=8500`),
another error takes precedence:
Segmentation fault: 11
Is there a limit to how large a schema can be?
Or is there a way to circumvent this problem when handling large schemas?
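A minimal reproduction sketch under the assumptions in this report (the `./schema.graphql` path is hypothetical, standing in for the ~2500-line schema described above):

```javascript
// Reproduction sketch: read a large SDL file and pass it through the
// augmentation step where the RangeError was reported.
const fs = require('fs');
const { makeAugmentedSchema } = require('neo4j-graphql-js');

// Hypothetical path to the large schema file from this report.
const typeDefs = fs.readFileSync('./schema.graphql', 'utf8');

// On the affected versions this call threw
// "RangeError: Maximum call stack size exceeded" for large schemas;
// per the comments above, the fix landed in v2.3.0 (#188).
const schema = makeAugmentedSchema({ typeDefs });
```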