Add initialValue, delayUpdates to allow components to eagerly render. #329

Merged · 17 commits · Sep 19, 2023
44 changes: 27 additions & 17 deletions API.md
@@ -25,18 +25,21 @@ If the requested key is a collection, it will return an object with all the coll
<dt><a href="#disconnect">disconnect(connectionID, [keyToRemoveFromEvictionBlocklist])</a></dt>
<dd><p>Remove the listener for a react component</p>
</dd>
<dt><a href="#notifySubscribersOnNextTick">notifySubscribersOnNextTick(key, value, [canUpdateSubscriber])</a></dt>
<dd><p>This method mostly exists for historical reasons as this library was initially designed without a memory cache and one was added later.
For this reason, Onyx works more similar to what you might expect from a native AsyncStorage with reads, writes, etc all becoming
available async. Since we have code in our main applications that might expect things to work this way it&#39;s not safe to change this
behavior just yet.</p>
<dt><a href="#maybeFlushBatchUpdates">maybeFlushBatchUpdates()</a> ⇒ <code>Promise</code></dt>
<dd><p>We batch Onyx updates together. This helps with use cases where we schedule several Onyx updates one after the other.
This happens, for example, in the Onyx.update function, where we process API responses that might contain a lot of
update operations. Instead of calling the subscribers for each update operation, we batch them together, which lets
React schedule the updates at once instead of one after the other. This is mainly a performance optimization.</p>
</dd>
<dt><a href="#notifyCollectionSubscribersOnNextTick">notifyCollectionSubscribersOnNextTick(key, value)</a></dt>
<dt><a href="#scheduleSubscriberUpdate">scheduleSubscriberUpdate(key, value, [canUpdateSubscriber])</a> ⇒ <code>Promise</code></dt>
<dd><p>Schedules an update that will be appended to the macro task queue (so it doesn&#39;t update the subscribers immediately).</p>
</dd>
<dt><a href="#scheduleNotifyCollectionSubscribers">scheduleNotifyCollectionSubscribers(key, value)</a> ⇒ <code>Promise</code></dt>
<dd><p>This method is similar to notifySubscribersOnNextTick but it is built for working specifically with collections
so that keysChanged() is triggered for the collection and not keyChanged(). If this was not done, then the
subscriber callbacks receive the data in a different format than they normally expect and it breaks code.</p>
</dd>
<dt><a href="#broadcastUpdate">broadcastUpdate(key, value, hasChanged, method)</a></dt>
<dt><a href="#broadcastUpdate">broadcastUpdate(key, value, hasChanged, method)</a> ⇒ <code>Promise</code></dt>
<dd><p>Notifies subscribers and writes current value to cache</p>
</dd>
<dt><a href="#hasPendingMergeForKey">hasPendingMergeForKey(key)</a> ⇒ <code>Boolean</code></dt>
@@ -154,6 +157,7 @@ Subscribes a react component's state directly to a store key
| [mapping.initWithStoredValues] | <code>Boolean</code> | If set to false, then no data will be prefilled into the component |
| [mapping.waitForCollectionCallback] | <code>Boolean</code> | If set to true, it will return the entire collection to the callback as a single object |
| [mapping.selector] | <code>function</code> | THIS PARAM IS ONLY USED WITH withOnyx(). If included, this will be used to subscribe to a subset of an Onyx key's data. The sourceData and withOnyx state are passed to the selector and should return the simplified data. Using this setting on `withOnyx` can have very positive performance benefits because the component will only re-render when the subset of data changes. Otherwise, any change of data on any property would normally cause the component to re-render (and that can be expensive from a performance standpoint). |
| [mapping.initialValue] | <code>String</code> \| <code>Number</code> \| <code>Boolean</code> \| <code>Object</code> | THIS PARAM IS ONLY USED WITH withOnyx(). If included, this will be passed to the component so that something can be rendered while data is being fetched from the DB. Note that it will not cause the component to have the loading prop set to true. |
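
A hypothetical `withOnyx()` mapping combining the `selector` and `initialValue` options described above (the key name and selected shape are illustrative, not taken from this repository):

```js
export default withOnyx({
    session: {
        key: ONYXKEYS.SESSION,
        // Re-render only when this subset of the session changes.
        selector: (session) => ({authToken: session ? session.authToken : null}),
        // Rendered immediately while the stored value is still being read from disk.
        initialValue: {},
    },
})(App);
```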

@@ -178,13 +182,19 @@ Remove the listener for a react component
```js
Onyx.disconnect(connectionID);
```
<a name="notifySubscribersOnNextTick"></a>
<a name="maybeFlushBatchUpdates"></a>

## maybeFlushBatchUpdates() ⇒ <code>Promise</code>
We batch Onyx updates together. This helps with use cases where we schedule several Onyx updates one after the other.
This happens, for example, in the Onyx.update function, where we process API responses that might contain a lot of
update operations. Instead of calling the subscribers for each update operation, we batch them together, which lets
React schedule the updates at once instead of one after the other. This is mainly a performance optimization.

**Kind**: global function
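
As a rough sketch of the batching idea described above (not the library's exact implementation; `unstable_batchedUpdates` is the per-platform batch entry point the library already imports from `./batch`):

```js
import unstable_batchedUpdates from './batch';

let batchUpdatesQueue = [];
let batchUpdatesPromise = null;

function maybeFlushBatchUpdatesSketch() {
    if (batchUpdatesPromise) {
        return batchUpdatesPromise;
    }

    // Wait one tick so updates scheduled back-to-back land in the same flush.
    batchUpdatesPromise = new Promise((resolve) => {
        setTimeout(() => {
            const updates = batchUpdatesQueue;
            batchUpdatesQueue = [];
            batchUpdatesPromise = null;

            // One React batch means one re-render per subscriber instead of one per update.
            unstable_batchedUpdates(() => {
                updates.forEach((update) => update());
            });
            resolve();
        }, 0);
    });
    return batchUpdatesPromise;
}
```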
<a name="scheduleSubscriberUpdate"></a>

## notifySubscribersOnNextTick(key, value, [canUpdateSubscriber])
This method mostly exists for historical reasons as this library was initially designed without a memory cache and one was added later.
For this reason, Onyx works more similar to what you might expect from a native AsyncStorage with reads, writes, etc all becoming
available async. Since we have code in our main applications that might expect things to work this way it's not safe to change this
behavior just yet.
## scheduleSubscriberUpdate(key, value, [canUpdateSubscriber]) ⇒ <code>Promise</code>
Schedules an update that will be appended to the macro task queue (so it doesn't update the subscribers immediately).

**Kind**: global function

@@ -196,11 +206,11 @@

**Example**
```js
notifySubscribersOnNextTick(key, value, subscriber => subscriber.initWithStoredValues === false)
scheduleSubscriberUpdate(key, value, subscriber => subscriber.initWithStoredValues === false)
```
<a name="notifyCollectionSubscribersOnNextTick"></a>
<a name="scheduleNotifyCollectionSubscribers"></a>

## notifyCollectionSubscribersOnNextTick(key, value)
## scheduleNotifyCollectionSubscribers(key, value) ⇒ <code>Promise</code>
This method is similar to notifySubscribersOnNextTick but it is built for working specifically with collections
so that keysChanged() is triggered for the collection and not keyChanged(). If this was not done, then the
subscriber callbacks receive the data in a different format than they normally expect and it breaks code.
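
For illustration, the difference in callback shape looks like this (the collection key here is hypothetical; `waitForCollectionCallback` is the documented option for receiving a whole collection):

```js
// Collection subscriber (keysChanged path): receives the whole collection keyed by member key.
Onyx.connect({
    key: ONYXKEYS.COLLECTION.REPORT,
    waitForCollectionCallback: true,
    callback: (allReports) => {
        // allReports looks like {report_1: {...}, report_2: {...}}
    },
});

// Plain key subscriber (keyChanged path): receives just the single stored value.
Onyx.connect({
    key: ONYXKEYS.SESSION,
    callback: (session) => {
        // session is the value stored under ONYXKEYS.SESSION
    },
});
```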
@@ -214,7 +224,7 @@

<a name="broadcastUpdate"></a>

## broadcastUpdate(key, value, hasChanged, method)
## broadcastUpdate(key, value, hasChanged, method) ⇒ <code>Promise</code>
Notifies subscribers and writes current value to cache

**Kind**: global function
29 changes: 28 additions & 1 deletion README.md
@@ -135,7 +135,34 @@ export default withOnyx({
})(App);
```

It is preferable to use the HOC over `Onyx.connect()` in React code as `withOnyx()` will delay the rendering of the wrapped component until all keys have been accessed and made available.
While `Onyx.connect()` gives you more control over how your component reacts as data is fetched from disk, `withOnyx()` will delay the rendering of the wrapped component until all keys/entities have been fetched and passed to the component, which can be convenient for simple cases. However, this can noticeably delay your application if many entities are connected to the same component. You can pass an `initialValue` to each key to allow Onyx to eagerly render your component with that value.

```javascript
export default withOnyx({
session: {
key: ONYXKEYS.SESSION,
initialValue: {}
},
})(App);
```

Additionally, a component that subscribes to many keys/entities will receive many updates as data is fetched from the DB and passed down to it, because every key that gets fetched triggers a `setState` on the `withOnyx` HOC. This can cause repeated re-renders during the initial mount, preventing the component from mounting/rendering in a reasonable time, making your app feel slow and even delaying animations. You can work around this by passing `true` as the second argument to `withOnyx()` (`shouldDelayUpdates`). Onyx will then queue all updates until you decide when they should be applied; for this, the component receives a `markReadyForHydration` function. A good place to call it is in the `onLayout` handler, which fires after your component has rendered.

```javascript
const App = ({session, markReadyForHydration}) => (
<View onLayout={() => markReadyForHydration()}>
{session.token ? <Text>Logged in</Text> : <Text>Logged out</Text> }
</View>
);

// Second argument to the function is `shouldDelayUpdates`
export default withOnyx({
session: {
key: ONYXKEYS.SESSION,
initialValue: {}
},
}, true)(App);
```
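
Under the hood, a simplified sketch of the delay-updates mechanism might look like the following (names and structure are illustrative, not the exact `withOnyx` implementation; the Onyx.js diff below routes subscriber updates through `setStateProxy` for this reason):

```javascript
import React from 'react';

class DelayedHydrationSketch extends React.Component {
    constructor(props) {
        super(props);
        this.state = {};
        this.shouldDelayUpdates = true;
        this.pendingSetStates = [];
        this.markReadyForHydration = this.markReadyForHydration.bind(this);
    }

    // Onyx subscribers call this instead of setState directly.
    setStateProxy(modifier) {
        if (this.shouldDelayUpdates) {
            // Queue the update until the wrapped component signals it's ready.
            this.pendingSetStates.push(modifier);
            return;
        }
        this.setState(modifier);
    }

    // Passed to the wrapped component as the `markReadyForHydration` prop.
    markReadyForHydration() {
        this.shouldDelayUpdates = false;
        this.pendingSetStates.forEach((modifier) => this.setState(modifier));
        this.pendingSetStates = [];
    }

    render() {
        // The real HOC renders the wrapped component and passes markReadyForHydration down.
        return null;
    }
}
```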

### Dependent Onyx Keys and withOnyx()
Some components need to subscribe to multiple Onyx keys at once and sometimes, one key might rely on the data from another key. This is similar to a JOIN in SQL.
42 changes: 27 additions & 15 deletions lib/Onyx.js
@@ -5,10 +5,9 @@ import * as Logger from './Logger';
import cache from './OnyxCache';
import * as Str from './Str';
import createDeferredTask from './createDeferredTask';
import fastMerge from './fastMerge';
import * as PerformanceUtils from './metrics/PerformanceUtils';
import Storage from './storage';
import Utils from './utils';
import utils from './utils';
import unstable_batchedUpdates from './batch';

// Method constants
@@ -415,7 +414,7 @@ function keysChanged(collectionKey, partialCollection, notifyRegularSubscibers =
// If the subscriber has a selector, then the component's state must only be updated with the data
// returned by the selector.
if (subscriber.selector) {
subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const previousData = prevState[subscriber.statePropertyName];
const newData = reduceCollectionWithSelector(cachedCollection, subscriber.selector, subscriber.withOnyxInstance.state);

@@ -429,7 +428,7 @@
continue;
}

subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const finalCollection = _.clone(prevState[subscriber.statePropertyName] || {});
const dataKeys = _.keys(partialCollection);
for (let j = 0; j < dataKeys.length; j++) {
@@ -458,7 +457,7 @@ function keysChanged(collectionKey, partialCollection, notifyRegularSubscibers =
// returned by the selector and the state should only change when the subset of data changes from what
// it was previously.
if (subscriber.selector) {
subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const prevData = prevState[subscriber.statePropertyName];
const newData = getSubsetOfData(cachedCollection[subscriber.key], subscriber.selector, subscriber.withOnyxInstance.state);
if (!deepEqual(prevData, newData)) {
@@ -473,9 +472,14 @@
continue;
}

subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const data = cachedCollection[subscriber.key];
const previousData = prevState[subscriber.statePropertyName];

// Avoids triggering unnecessary re-renders when feeding empty objects
if (utils.areObjectsEmpty(data, previousData)) {
return null;
}
if (data === previousData) {
return null;
}
@@ -548,7 +552,7 @@ function keyChanged(key, data, canUpdateSubscriber, notifyRegularSubscibers = tr
// If the subscriber has a selector, then the consumer of this data must only be given the data
// returned by the selector and only when the selected data has changed.
if (subscriber.selector) {
subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const prevData = prevState[subscriber.statePropertyName];
const newData = {
[key]: getSubsetOfData(data, subscriber.selector, subscriber.withOnyxInstance.state),
@@ -568,7 +572,7 @@
continue;
}

subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const collection = prevState[subscriber.statePropertyName] || {};
const newCollection = {
...collection,
@@ -585,7 +589,7 @@
// If the subscriber has a selector, then the component's state must only be updated with the data
// returned by the selector and only if the selected data has changed.
if (subscriber.selector) {
subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const previousValue = getSubsetOfData(prevState[subscriber.statePropertyName], subscriber.selector, subscriber.withOnyxInstance.state);
const newValue = getSubsetOfData(data, subscriber.selector, subscriber.withOnyxInstance.state);
if (!deepEqual(previousValue, newValue)) {
@@ -599,8 +603,13 @@
}

// If we did not match on a collection key then we just set the new data to the state property
subscriber.withOnyxInstance.setState((prevState) => {
subscriber.withOnyxInstance.setStateProxy((prevState) => {
const previousData = prevState[subscriber.statePropertyName];

// Avoids triggering unnecessary re-renders when feeding empty objects
if (utils.areObjectsEmpty(data, previousData)) {
return null;
}
if (previousData === data) {
return null;
}
@@ -728,6 +737,9 @@ function getCollectionDataAndSendAsObject(matchingKeys, mapping) {
* The sourceData and withOnyx state are passed to the selector and should return the simplified data. Using this setting on `withOnyx` can have very positive
* performance benefits because the component will only re-render when the subset of data changes. Otherwise, any change of data on any property would normally
* cause the component to re-render (and that can be expensive from a performance standpoint).
* @param {String | Number | Boolean | Object} [mapping.initialValue] THIS PARAM IS ONLY USED WITH withOnyx().
* If included, this will be passed to the component so that something can be rendered while data is being fetched from the DB.
* Note that it will not cause the component to have the loading prop set to true.
* @returns {Number} an ID to use when calling disconnect
*/
function connect(mapping) {
@@ -1008,7 +1020,7 @@ function set(key, value) {
Logger.logAlert(`Onyx.set() called after Onyx.merge() for key: ${key}. It is recommended to use set() or merge() not both.`);
}

const valueWithNullRemoved = Utils.removeNullObjectValues(value);
const valueWithNullRemoved = utils.removeNullObjectValues(value);

const hasChanged = cache.hasValueChanged(key, valueWithNullRemoved);

@@ -1078,7 +1090,7 @@ function applyMerge(existingValue, changes) {
// Object values are merged one after the other
// lodash adds a small overhead so we don't use it here
// eslint-disable-next-line prefer-object-spread, rulesdir/prefer-underscore-method
return _.reduce(changes, (modifiedData, change) => fastMerge(modifiedData, change),
return _.reduce(changes, (modifiedData, change) => utils.fastMerge(modifiedData, change),
existingValue || {});
}

@@ -1127,14 +1139,14 @@ function merge(key, changes) {
delete mergeQueuePromise[key];

// After that we merge the batched changes with the existing value
const modifiedData = Utils.removeNullObjectValues(applyMerge(existingValue, [batchedChanges]));
const modifiedData = utils.removeNullObjectValues(applyMerge(existingValue, [batchedChanges]));

// On native platforms we use SQLite which utilises JSON_PATCH to merge changes.
// JSON_PATCH generally removes top-level nullish values from the stored object.
// When there is no existing value though, SQLite will just insert the changes as a new value and thus the top-level nullish values won't be removed.
// Therefore we need to remove nullish values from the `batchedChanges` which are sent to the SQLite, if no existing value is present.
if (!existingValue) {
batchedChanges = Utils.removeNullObjectValues(batchedChanges);
batchedChanges = utils.removeNullObjectValues(batchedChanges);
}

const hasChanged = cache.hasValueChanged(key, modifiedData);
@@ -1168,7 +1180,7 @@ function initializeWithDefaultKeyStates() {
.then((pairs) => {
const asObject = _.object(pairs);

const merged = fastMerge(asObject, defaultKeyStates);
const merged = utils.fastMerge(asObject, defaultKeyStates);
cache.merge(merged);
_.each(merged, (val, key) => keyChanged(key, val));
});
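
The hunks above bail out of `setStateProxy` when `utils.areObjectsEmpty(data, previousData)` is true, but the helper itself is not part of this diff. A minimal sketch of what such a check might look like (an assumption, not the actual implementation):

```js
import _ from 'underscore';

// Hypothetical sketch: true when both values are objects with no keys,
// so replacing one empty object with another never triggers a re-render.
function areObjectsEmpty(a, b) {
    return _.isObject(a) && _.isObject(b) && _.isEmpty(a) && _.isEmpty(b);
}
```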
4 changes: 2 additions & 2 deletions lib/OnyxCache.js
@@ -1,6 +1,6 @@
import _ from 'underscore';
import {deepEqual} from 'fast-equals';
import fastMerge from './fastMerge';
import utils from './utils';

const isDefined = _.negate(_.isUndefined);

@@ -119,7 +119,7 @@ class OnyxCache {

// lodash adds a small overhead so we don't use it here
// eslint-disable-next-line prefer-object-spread, rulesdir/prefer-underscore-method
this.storageMap = Object.assign({}, fastMerge(this.storageMap, data));
this.storageMap = Object.assign({}, utils.fastMerge(this.storageMap, data));

const storageKeys = this.getAllKeys();
const mergedKeys = _.keys(data);
4 changes: 2 additions & 2 deletions lib/storage/__mocks__/index.js
@@ -1,5 +1,5 @@
import _ from 'underscore';
import fastMerge from '../../fastMerge';
import utils from '../../utils';

let storageMapInternal = {};

@@ -27,7 +27,7 @@ const idbKeyvalMock = {
_.forEach(pairs, ([key, value]) => {
const existingValue = storageMapInternal[key];
const newValue = _.isObject(existingValue)
? fastMerge(existingValue, value) : value;
? utils.fastMerge(existingValue, value) : value;

set(key, newValue);
});