react-final-form
Performance issues with large forms
Are you submitting a bug report or a feature request?
Bug report
What is the current behavior?
Forms with a large number of fields (>100, as is often the case with arrays and nested subforms) take a long time to render.
The slowdown is most noticeable on the initial render, where the form can take tens of seconds to appear. Subsequent renders are usually fast enough, but not instant, causing some input delay.
If I run the Chrome devtools profiler, I can see that the initial render slowdown is caused by notifySubscriber taking more and more time as more fields are registered, making me think that notifySubscriber is not O(1) but O(fields already registered).
I'm building forms recursively from a JSON schema. Maybe react-final-form is not the right tool for the job, since all fields are known in advance, so I don't need notifySubscriber at all?
What is the expected behavior?
Speedy Gonzales!
Sandbox Link
I'll try and put together an example soon!
What's your environment?
final-form 4.6.1
final-form-arrays 1.0.4
react 16.3.2
react-final-form 3.4.2
react-final-form-arrays 1.0.4
Other information
See also #230.
Here's a profile I made with Google Chrome; you should be able to load it as well. profile.json.txt
I managed to optimize this by limiting the subscriptions and pausing validation, but I still have an irreducible problem during unmount. @erikras, do you have any ideas?
Note that I am seeing validations run on unmount. While not necessarily the same case, it is related to the performance of large or complex forms with validations. #408
Maybe it's better to pass a validateOnBlur prop. I didn't try it, but I think it could be effective.
The validation has to run whenever a new field is added, but on initial render, there is a mechanism that should (perhaps it's broken here?) pause all validation from running until the entire form has been rendered (i.e. all fields registered) and then runs validation. It'll require some investigation...
I'm having the same performance issue when dealing with large forms. I created an example to show the re-render issue: https://codesandbox.io/embed/react-final-form-simple-example-bsbtj
Any field change will cause the whole form to re-render, down to every field.
I tried applying validateOnBlur at the <Form /> level, but it doesn't reduce the frequency of re-renders: every keystroke still triggers a re-render of the whole form.
Is this expected behavior, or am I using it wrong?
Any ideas? Thanks.
@lazurey Yep, that's by design, because it's easier out of the box and isn't a problem for most (small) forms. To fine-tune the rerendering, you've gotta put a subscription on the <Form>. See the 💥 Performance Optimization Through Subscriptions 💥 example.
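For anyone landing here later, a minimal sketch of what that looks like. The `subscription` prop on `<Form>` and `<Field>` is from the react-final-form docs; the field names and handlers here are made up for illustration:

```jsx
// Only re-render the top-level <Form> render prop when these pieces of
// state change; each <Field> still re-renders itself on its own changes.
<Form onSubmit={onSubmit} subscription={{ submitting: true, pristine: true }}>
  {({ handleSubmit, submitting, pristine }) => (
    <form onSubmit={handleSubmit}>
      {/* A field can also narrow its own subscription */}
      <Field
        name="firstName"
        subscription={{ value: true, touched: true, error: true }}
      >
        {({ input, meta }) => <input {...input} placeholder="First name" />}
      </Field>
      <button type="submit" disabled={submitting || pristine}>
        Submit
      </button>
    </form>
  )}
</Form>
```

With no `subscription` prop, the render prop is notified of every piece of form state, which is why every keystroke re-renders the whole form.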
I have created a Vue version of final-form. I had this issue with field arrays containing thousands of fields, but I had a trick: I disabled validation on the first render and then resumed it with setTimeout(fn, 0), and we are good now. I don't know whether this is implemented in the React version or not. Another trick: I created a stateful wrapper around the fields and used a debounce function. The subscriptions didn't help me, @erikras
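The pause-then-resume trick can be sketched in isolation. final-form itself exposes `form.pauseValidation()` and `form.resumeValidation()` on its form API for exactly this; the toy `ValidationGate` below just illustrates the mechanics with no dependencies (all names are made up):

```typescript
// Sketch: defer validation while fields are registering, then run one
// batched validation afterwards (e.g. in setTimeout(fn, 0) after mount).
class ValidationGate {
  private paused = false;
  private pending = new Set<string>();
  runs = 0; // how many times validation actually executed

  pause(): void {
    this.paused = true;
  }

  validate(field: string): void {
    if (this.paused) {
      // While paused, just remember that this field needs validating.
      this.pending.add(field);
      return;
    }
    this.runs++;
  }

  resume(): void {
    this.paused = false;
    // One batched run covers everything that queued up while paused.
    if (this.pending.size > 0) {
      this.runs++;
      this.pending.clear();
    }
  }
}

const gate = new ValidationGate();
gate.pause(); // before mounting the field array
for (let i = 0; i < 1000; i++) {
  gate.validate(`items[${i}]`); // each field registration requests validation
}
gate.resume(); // after the whole form has mounted
console.log(gate.runs); // 1 instead of 1000
```
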
Twitter thread about the "thousand field problem". @alirezavalizade, I don't know how easy it would be to throw your thing in there, but if it's doable, I'd love to have it.
Also, if you're not talking about vue-final-form, I'd be happy to link to your thing on the FF readme.
BTW, to the original poster, @romeovs, RFFv5 (and now v6) made a significant change, allowed by the glory of hooks, to how field-level validation is done on first render. This whole thing might be okay now. 🙏
We didn't publish it yet, but we are using it in our company, I'll make a pull request soon.
@erikras how can it be controlled with hooks? That's still an issue.
You might be interested in using this library (disclaimer: I am the author of the project). It is based on hooks and solves the issue of large form state. Here is a demo of a form with 5000 fields: state updates on every keystroke without any performance lag. It supports validation too, and you can write other custom plugins.
@alirezavalizade any updates on that pull request? I'm facing this problem as well :(
One year later, any updates?
Did George R.R. Martin write that pull request, by chance?
Running into this now :(
@mmahalwy no one cares about this particular issue)
@vasilich6107 sad
"High performance" subscription-based form state management for React.
Well well well, if it isn't an issue from 2018
sad story, looks like it's time to look for another form library
Also experiencing performance issues, any updates?
Short answer
Use field validation and memoize the validation function(s).
Long answer
The impact on your form is going to depend on whether you're using purely form-level validation or purely field-level validation. Personally, I'd strongly advise against using both, as each field registration triggers both the field- and form-level validations. In both cases we're facing a summation problem.
For example's sake, we'll say we have 100 fields in the form.
In the form validation case, you'll have 100 calls to the form validation function, as each field registration triggers form validation. If you're dealing with a dynamic validation schema, each field addition increases the execution time of the schema validation; if you're using a static schema, the execution time is constant, but it's likely to be the upper limit. Either way, the form validation function is being thrashed, though that may be necessary, given that each field addition may change the validation schema.
In the field validation case, you have a summation over the total number of fields to be rendered, i.e. for 100 fields, you run the field validation 1 + 2 + 3 + ... + 100 times. In total that's 5050 calls for 100 fields.
From some basic testing in our product, with no optimisation, using form validation tends to be faster. However, in the field validation case, after the first call to each field validation function, the values passed to it most likely don't change, making it a prime candidate for memoization. This does require that your validation functions be pure (given the same inputs, they give the same outputs), but I can't think of a case where a validation function couldn't be made pure.
By memoizing the field validation function(s), after the first call to the validation function for each field, every other call during the initial render becomes constant time, meaning our summation is reduced to a linear number of calls, i.e. for 100 fields you have 100 actual executions of the validation function.
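That call-count reduction can be demonstrated with no dependencies. This toy sketch simulates the initial-render pattern described above, where registering field k re-runs validation for all k fields registered so far (the memoizer here is a stand-in for memoizee, and all names are illustrative):

```typescript
type Validator = (fieldName: string, value: unknown) => string | undefined;

// Naive validator: counts every real execution.
let naiveExecutions = 0;
const naiveValidate: Validator = (fieldName, value) => {
  naiveExecutions++;
  return value == null ? "Required" : undefined;
};

// Tiny memoizer keyed on (fieldName, value) -- a stand-in for memoizee.
function memoizeValidator(fn: Validator): Validator {
  const cache = new Map<string, string | undefined>();
  return (fieldName, value) => {
    const key = `${fieldName}\u0000${String(value)}`;
    if (!cache.has(key)) cache.set(key, fn(fieldName, value));
    return cache.get(key);
  };
}

let memoExecutions = 0;
const memoValidate = memoizeValidator((fieldName, value) => {
  memoExecutions++;
  return value == null ? "Required" : undefined;
});

// Register N fields; each registration re-validates all fields so far.
const N = 100;
function simulateInitialRender(validate: Validator): void {
  for (let registered = 1; registered <= N; registered++) {
    for (let i = 1; i <= registered; i++) {
      validate(`field${i}`, "some value");
    }
  }
}

simulateInitialRender(naiveValidate);
simulateInitialRender(memoValidate);

console.log(naiveExecutions); // 5050 = 100 * 101 / 2
console.log(memoExecutions); // 100 -- one real execution per field
```
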
Here's an example of the validation function we use. We make use of Yup for validation and memoizee for memoization.
```ts
import memoize from 'memoizee';
import type { AnySchema, ValidationError } from 'yup';

/**
 * Memoized application of the field's schema to the given value.
 * Only the first three parameters are included as comparisons in the memoize cache evaluation.
 * @param fieldName The field being validated
 * @param value The field's value
 * @param schema The field's schema
 * @returns The validation error, or undefined if there is none.
 *
 * NOTE: The cache is set to 1000. This means AT MOST, there may be 1000 fields (field validations) cached at a time.
 * Over this number, and potentially before, depending on what operations are occurring, all additional fields over this value
 * will begin causing the validation thrashing previously seen. If you somehow hit 1000 fields in your form, maybe you should
 * reconsider that and not this...
 */
const validateField = memoize(
  (
    fieldName: string,
    value: any,
    schema: AnySchema
  ): ValidationError | undefined => {
    // Uncomment this line to check cache hits
    // console.log(`Validating field: ${fieldName}`);
    try {
      schema.validateSync(value, {
        strict: false,
        abortEarly: false,
        stripUnknown: true,
      });
      return undefined;
    } catch (err) {
      // For a result to be cached, we can't throw an error, so instead just return it
      return err as ValidationError;
    }
  },
  { length: 3, max: 1000, profileName: 'Field Validation' }
);
```
By adding the memoization, we reduced our form render time with 254 fields from ~15s down to ~4s.
I have encountered serious performance difficulties when creating a table with draggable rows. Each row contained 5 input fields that used custom components from our library.
Furthermore, I needed to add rows at the top (I couldn't use unshift, but that's another issue with keys).
To start, I took a basic example for study. https://codesandbox.io/s/react-final-form-field-arrays-react-beatiful-dnd-as-drag-drop-forked-9qhp3m
Everything worked quickly and well, but when adding custom components (of any kind), performance decreased. I created an example demonstrating the performance issues, which start around 100 rows and become very noticeable at 120 or more. Try quickly adding 100+ rows and then start filling in the input fields; you will notice lagging. https://codesandbox.io/s/react-final-form-field-arrays-react-beatiful-dnd-as-drag-drop-forked-jqtfmc
The problem is that a full re-render occurs, and various combinations of subscriptions do not help.
After many experiments, I came to the conclusion that I should abandon using an array in favor of normalized data: an array of IDs plus a data object. This also allowed me to solve the issues with the unshift and move mutators.
https://codesandbox.io/s/react-final-form-rows-field-array-alternative-react-dnd-as-drag-drop-6vrpy6
https://github.com/makhnatkin/react-final-form-rows/blob/main/src/useRows.ts
However, my solution is not integrated with the form and is located in a separate part. Perhaps creating a custom mutator will allow for a more proper integration.
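The normalized-rows idea can be sketched on its own. Instead of a field array whose indices shift on unshift/move (renaming every registered field), keep an ordered list of stable ids plus a lookup object, and key field names by id (e.g. `rows[row1].name`). All names below are illustrative and independent of react-final-form:

```typescript
interface Row {
  name: string;
}

interface NormalizedRows {
  ids: string[]; // render order
  byId: Record<string, Row>; // stable id -> row data
}

let nextId = 0;
const newId = (): string => `row${nextId++}`;

// Adding a row at the top never renames existing fields: only a new id
// is prepended, and every existing id (and field name) stays the same.
function unshiftRow(state: NormalizedRows, row: Row): NormalizedRows {
  const id = newId();
  return { ids: [id, ...state.ids], byId: { ...state.byId, [id]: row } };
}

// Moving a row reorders ids only; byId is untouched, so no field renames.
function moveRow(state: NormalizedRows, from: number, to: number): NormalizedRows {
  const ids = [...state.ids];
  const [id] = ids.splice(from, 1);
  ids.splice(to, 0, id);
  return { ...state, ids };
}

let state: NormalizedRows = { ids: [], byId: {} };
state = unshiftRow(state, { name: "second" }); // id row0
state = unshiftRow(state, { name: "first" }); // id row1, placed on top
console.log(state.ids); // ['row1', 'row0']
state = moveRow(state, 1, 0); // move "second" back to the top
console.log(state.byId[state.ids[0]].name); // 'second'
```
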
@erikras I suggest adding a similar solution to final-form-arrays.