Performance issues with large forms #336
I managed to optimize this by limiting the subscriptions and pausing the validation, but I still have an irreducible problem during the unmount. @erikras, do you have an idea, maybe?
Note that I am seeing validations run on unmount. While not necessarily the same case, it is related to the performance of large or complex forms with validations #408
Maybe it's better to pass a validateOnBlur prop. I didn't try it, but I think it could be effective.
The validation has to run whenever a new field is added, but on initial render there is a mechanism that should (perhaps it's broken here?) pause all validation until the entire form has been rendered (i.e. all fields registered) and then run validation. It'll require some investigation...
I'm having the same performance issue when dealing with large forms. Any field change causes the whole form to re-render, deep into every field. Any ideas? Thanks.
@lazurey Yep, that's by design because it's easier out of the box and isn't a problem for most (small) forms. To fine-tune the rerendering, you've gotta put a
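For reference, a minimal sketch of fine-tuning re-renders with react-final-form's `subscription` props, which is an assumption about what the truncated sentence above refers to (the field names are made up):

```jsx
// Sketch: each component subscribes only to the form/field state it actually
// uses, so a keystroke in one field no longer re-renders the whole form.
import React from 'react';
import { Form, Field } from 'react-final-form';

const BigForm = ({ onSubmit }) => (
  <Form onSubmit={onSubmit} subscription={{ submitting: true, pristine: true }}>
    {({ handleSubmit, submitting }) => (
      <form onSubmit={handleSubmit}>
        {/* This Field only re-renders when its own value/error/touched change. */}
        <Field
          name="firstName"
          subscription={{ value: true, error: true, touched: true }}
        >
          {({ input, meta }) => (
            <div>
              <input {...input} placeholder="First name" />
              {meta.touched && meta.error && <span>{meta.error}</span>}
            </div>
          )}
        </Field>
        <button type="submit" disabled={submitting}>
          Submit
        </button>
      </form>
    )}
  </Form>
);
```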
I have created a Vue version of final-form. I had this issue on field arrays with thousands of fields, but I had a trick: I disabled the validations on first render and then resumed them with setTimeout(fn, 0), and we are good now. I don't know whether this is implemented in the React version or not.
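final-form's FormApi does expose pauseValidation() and resumeValidation(), so a rough sketch of the same trick on the React/JS side could look like this (registerAllFields is a hypothetical placeholder, and whether the setTimeout timing helps in any given form is untested here):

```js
// Sketch: pause field/record-level validation while the (many) fields register,
// then resume it once the initial render has flushed.
import { createForm } from 'final-form';

const form = createForm({
  onSubmit: (values) => console.log(values),
  validate: () => ({}), // record-level validation stub
});

form.pauseValidation(); // validators won't run while fields register

registerAllFields(form); // hypothetical: registers thousands of fields

setTimeout(() => {
  form.resumeValidation(); // validation runs once, after the initial render
}, 0);
```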
Twitter thread about the "thousand field problem". @alirezavalizade, I don't know how easy it would be to throw your thing in there, but if it's doable, I'd love to have it. Also, if you're not talking about
BTW, to the original poster, @romeovs, RFFv5 (and now v6) made a significant change, allowed by the glory of hooks, to how field-level validation is done on first render. This whole thing might be okay now. 🙏
We didn't publish it yet, but we are using it in our company. I'll make a pull request soon.
@erikras how can it be controlled with hooks?
You might be interested in using this library (disclaimer: I am the author of the project). It is based on hooks and solves the issue of large form states. Here is a demo of a form with 5000 fields: state updates on every keystroke without any performance lag. It supports validation too, and you can write other custom plugins.
@alirezavalizade any updates on that pull request? I'm facing this problem as well :(
One year later, any updates?
Did George R.R. Martin write that pull request, by chance?
Running into this now :(
@mmahalwy no one cares about this particular issue)
@vasilich6107 sad
"High performance" subscription-based form state management for React.
Well well well, if it isn't an issue from 2018
sad story, looks like it's time to look for another form library
Also experiencing performance issues, any updates?
Short answer: use field validation and memoize the validation function(s).

Long answer: the impact on your form is going to depend on whether you're using purely form validation or purely field-level validation. Personally, I'd strongly suggest against using both, as each field registration triggers both the field- and form-level validations. In both cases we're facing a summation problem. For example's sake, we'll say we have 100 fields in the form.

In the form validation case, you'll have 100 calls to the form validation function, as each field registration triggers the form validation. If you're dealing with a dynamic validation schema, each field addition increases the execution time of the schema validation; if you're using a static schema, the execution time of the validation is constant, but it's likely to be the upper limit. Either way, the form validation function is being thrashed; however, that may be necessary, given that each field addition may change the validation schema.

In the field validation case, you have a summation over the total number of fields to be rendered, i.e. for 100 fields you run the field validation on the order of 1 + 2 + ... + 100 = 5050 times, because each new field registration re-runs the validators of the fields registered so far.

From some basic testing in our product, with no optimisation, using form validation tends to be faster. However, for the field validation case, after the first call to each of the field validation functions, the values passed to it most likely don't change, meaning it's a prime case for memoization. This does require that your validation functions are pure (given the same inputs, they give the same outputs); however, I can't think of a case where a validation function couldn't be made pure.

By memoizing the field validation function(s), after the first call to the validation function for each field, every other call during the initial render becomes constant time, meaning our summation is reduced to a linear number of calls, i.e. for 100 fields you have 100 ACTUAL executions of the validation function.

Here's an example of the validation function we use. We make use of Yup for validation and memoizee for memoization.
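Along those lines, here is a minimal sketch of a memoized Yup field-level validator using memoizee; the schema and helper name are illustrative, not the exact code from the comment above:

```js
// Sketch: one memoized validator per field. After the first call with a given
// value, repeated calls during the initial render hit the cache instead of
// re-running the Yup validation.
import * as yup from 'yup';
import memoize from 'memoizee';

const schema = yup.object({
  firstName: yup.string().required('Required'),
  email: yup.string().email('Invalid email').required('Required'),
});

const makeFieldValidator = (name) =>
  memoize(
    async (value) => {
      try {
        await yup.reach(schema, name).validate(value);
        return undefined; // valid: react-final-form expects undefined
      } catch (err) {
        return err.message; // invalid: return the error message
      }
    },
    { promise: true } // cache resolved values of the returned promise
  );

const validateEmail = makeFieldValidator('email');

// Usage: <Field name="email" validate={validateEmail} />
```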
By adding the memoization we reduced our form render time with 254 fields from ~15s down to ~4s.
I have encountered serious performance difficulties when creating a table with draggable rows. Each row contained 5 input fields that used custom components from our library. Furthermore, I needed to add rows at the top (I couldn't use unshift, but that's another issue, with keys).

To start, I took a basic example to study. Everything worked quickly and well, but when adding custom components (any), the performance decreased. I created an example demonstrating the performance issues, which started around 100 rows and became very noticeable at 120 or more. Try quickly adding 100+ rows and start filling in input fields; you will notice lagging. The problem is that a full re-render occurs, and various subscription configurations do not help.

After many experiments, I came to the conclusion that I should abandon using an array in favor of normalized data: an array of IDs and a data object. This also allowed me to solve the issues with the unshift and move mutators. However, my solution is not integrated with the form and lives in a separate part of the state. Perhaps creating a custom mutator would allow for a more proper integration. @erikras I suggest adding a similar solution to the final-form-array
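A rough sketch of the normalized shape described above (the names are illustrative, not the commenter's actual code):

```js
// Row order lives in an id list, row data in an object keyed by id.
const initialState = {
  ids: [],  // e.g. ['row-2', 'row-1']: drives row order, so unshift/move only touch this array
  byId: {}, // e.g. { 'row-1': { name: '', qty: 0 } }: per-row field values
};

// Adding a row at the top prepends an id and adds one entry, instead of
// re-indexing every array-based field name like rows[0].name, rows[1].name, ...
const addRowAtTop = (state, id, data) => ({
  ids: [id, ...state.ids],
  byId: { ...state.byId, [id]: data },
});

// Editing one row touches a single entry in byId.
const updateRow = (state, id, patch) => ({
  ...state,
  byId: { ...state.byId, [id]: { ...state.byId[id], ...patch } },
});
```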
Are you submitting a bug report or a feature request?
Bug report
What is the current behavior?
Forms with a large number of fields (>100, as is often the case with arrays and nested subforms) take a long time to render.
The slowdown is most notable on the initial render, where the form can take tens of seconds to appear. Renders after that are usually fast enough, but not instant, causing some input delay.
If I run the Chrome devtools, I can see that the initial render slowdown is caused by `notifySubscriber` taking more and more time as more fields are being registered, making me think that `notifySubscriber` is not `O(1)` but `O(fields already registered)`.
I'm building forms recursively from a JSON schema. Maybe `react-final-form` is not the right tool for the job, since all fields are known in advance, so I don't need `notifySubscriber` at all?
What is the expected behavior?
Speedy Gonzales!
Sandbox Link
I'll try and put together an example soon!
What's your environment?
Other information
See also #230.
Here's a profile I made with Google Chrome; you should be able to load it as well.
profile.json.txt