fix: use more reliable comparison in deduping search results #6034
Description
Since we moved the recent search results to a server-side service, we started seeing duplicate results in the global search results. The previous implementation, which stored search results in local storage, stringified recent searches before storing them and compared stringified new searches against them.
Since we switched to a backend service, we can no longer rely on object keys being serialized in the same order, so the string comparison stopped detecting duplicates. This PR uses the lodash isEqual method (already used in many other places in the studio) to compare the values of the objects directly, rather than comparing string representations.
What to review
The changed file, and any performance or other repercussions of using this comparison.
Testing
Tests already exist for this utility, but they did not catch this regression because of mocking. A ticket has been created to improve this coverage in the future.
Notes for release
Resolves an issue where the server-side storage of global recent searches could contain duplicate entries. Recent searches should now all be unique terms.