This repository has been archived by the owner on Sep 3, 2021. It is now read-only.

'Maximum call stack size exceeded' on large schema #172

Closed
wimklerkx opened this issue Jan 8, 2019 · 11 comments

Comments

@wimklerkx

My schema.graphql is rather large: about 2,500 lines without comments.

Up to version 2.0.1 there were no problems with this schema, but after updating to v2.1.1 the following error occurs:
RangeError: Maximum call stack size exceeded

Reducing the schema to about 250 lines removes the error.

Increasing the Node stack size does not help. When the stack size is raised to 8500 (--stack-size=8500), another error takes precedence:
Segmentation fault: 11

Is there a limit to how large a schema can be?
Or is there a way to work around this problem when handling large schemas?
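For context on why --stack-size moves the failure point: V8 throws this RangeError once recursion exceeds the engine's call-stack budget, which is set by the runtime, not by the schema. A small probe (a hypothetical illustration, not part of neo4j-graphql-js) makes the limit visible:

```javascript
// Hypothetical probe: how deep can recursion go before V8 throws
// "RangeError: Maximum call stack size exceeded"? The exact depth
// depends on the Node version, the platform, and the --stack-size flag.
function probe(n) {
  try {
    return probe(n + 1);
  } catch (e) {
    return n; // the RangeError was thrown here; n approximates the limit
  }
}

// Prints the approximate maximum recursion depth for this process.
console.log(probe(0));
```

Running this under `node --stack-size=8500 probe.js` should print a larger number than the default, which is why raising the flag can delay the error without fixing deep recursion in the underlying code.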

@jexp
Contributor

jexp commented Jan 8, 2019

Is that in neo4j-graphql-js ?

Or the server plugin?

@jexp jexp transferred this issue from neo4j-graphql/neo4j-graphql Jan 9, 2019
@wimklerkx
Author

The error occurs in the makeAugmentedSchema({typeDefs}) function, which is part of neo4j-graphql-js.

@johnymontana
Contributor

@wimklerkx Would it be possible for you to share your schema SDL so we can reproduce the error?

@rdimicheleb

I had the same problem and solved it by disabling the auto-generated queries and mutations:

module.exports = augmentSchema(executableSchema, {
  query: false,
  mutation: false
});

@wimklerkx
Author

@rdimicheleb That would defeat the purpose of the makeAugmentedSchema function.
That said, I'm indeed now headed toward writing my own query and mutation resolvers.

@rdimicheleb

@wimklerkx You are right that you need to write your own mutations, but I can still query the graph without writing every single query in the resolver. I only need to define the main queries in the resolver using the neo4jgraphql function. Example:

resolver:

// ...
Query: {
    company: (object, params, context, resolveInfo) => {
        return neo4jgraphql(object, params, context, resolveInfo, process.env.NODE_ENV === 'development')
    },
// ...

schema:

type Company {
    name: String
    employees: [Person] @relation(name: "EMPLOYEE_OF", direction: "IN")
}
type Query {
    company(id: ID): Company
}

And I don't need to write any custom resolvers to query employees or persons.

I know this issue should still be fixed, but the above allowed me to continue working without crashing the server.

I hope you find a solution soon.

@wimklerkx
Author

@rdimicheleb Thanks for the elaboration, will do :)

@wimklerkx
Author

@johnymontana Here is the large schema causing the problem:
schema.txt

@michaeldgraham
Collaborator

Hey everyone, this problem was caused when I introduced the use of lodash/cloneDeep while refactoring some of the augmentation code a while back. It was having problems when cloning a type in a large schema. Sorry I didn't catch it before.

Thanks for sharing your schema, @wimklerkx. As of #188, if you're using a larger schema, you should again be able to run the augmentation process on it. If you run into any further problems, feel free to ping me here.
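For anyone curious why a deep clone can hit this error: a clone that recurses once per level of nesting consumes one stack frame per level, so a sufficiently large or deeply nested structure exhausts the call stack. A minimal sketch of that failure mode (a hypothetical illustration, not lodash's or the library's actual code):

```javascript
// Hypothetical naive deep clone: recursion depth equals nesting depth,
// so deeply nested data overflows the call stack. This illustrates the
// failure mode reported in this issue; it is not neo4j-graphql-js code.
function recursiveClone(value) {
  if (value === null || typeof value !== 'object') return value;
  const copy = Array.isArray(value) ? [] : {};
  for (const key of Object.keys(value)) {
    copy[key] = recursiveClone(value[key]); // one stack frame per level
  }
  return copy;
}

// Build a very deeply nested object iteratively (no recursion needed).
const deep = {};
let node = deep;
for (let i = 0; i < 200000; i++) {
  node.next = {};
  node = node.next;
}

try {
  recursiveClone(deep);
  console.log('cloned without overflow');
} catch (e) {
  console.log(e.constructor.name); // prints "RangeError" on default stack sizes
}
```

Avoiding the recursive clone (or cloning with an explicit work stack) removes the dependence on call-stack depth entirely, which is why the fix works regardless of schema size.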

@johnymontana
Contributor

The fix for this should be available now in v2.3.0 - please test and let us know if it works.

@wimklerkx
Author

Yes, the call stack size error is now gone.
Thanks :)
