
[2DC] Handle producer cluster changes #2365

Closed
ndeodhar opened this issue Sep 18, 2019 · 2 comments
@ndeodhar
Contributor

Today, we get a copy of the producer cluster's metadata while setting up replication. Later, when the producer cluster changes (say, nodes are added or removed), the consumer's view of the producer cluster is not updated.

We should modify the CDC consumer to periodically fetch a fresh copy of the producer cluster's metadata.
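The periodic-refresh idea could look something like the sketch below. This is a minimal illustration, not YugabyteDB code: `ProducerClusterCache` and its `FetchFn` callback are hypothetical names standing in for whatever RPC the consumer would use to re-read the producer's node list.

```cpp
#include <functional>
#include <mutex>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch: the consumer keeps a cached copy of the producer
// cluster's node list and refreshes it via a caller-supplied fetch callback,
// invoked periodically from a background timer.
class ProducerClusterCache {
 public:
  using FetchFn = std::function<std::vector<std::string>()>;

  explicit ProducerClusterCache(FetchFn fetch) : fetch_(std::move(fetch)) {}

  // Called periodically to pick up added/removed producer nodes.
  void Refresh() {
    std::vector<std::string> latest = fetch_();
    std::lock_guard<std::mutex> l(mu_);
    nodes_ = std::move(latest);
  }

  // Snapshot of the last-known producer node list.
  std::vector<std::string> Nodes() const {
    std::lock_guard<std::mutex> l(mu_);
    return nodes_;
  }

 private:
  FetchFn fetch_;
  mutable std::mutex mu_;
  std::vector<std::string> nodes_;
};
```

A background thread would simply call `Refresh()` on an interval; readers always see a consistent snapshot because the swap happens under the lock.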

@ndeodhar ndeodhar added the area/cdc Change Data Capture label Sep 18, 2019
@ndeodhar ndeodhar added this to the v2.1 milestone Sep 18, 2019
@rahuldesirazu
Contributor

I feel like this should come for free once we have a meta cache implementation on the consumer side. That should automatically figure out new tablet locations when servers are added or removed on the producer cluster.
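The consumer-side meta cache idea can be sketched as a lazily populated tablet-to-server map with explicit invalidation. This is an illustrative stand-in, not the real YugabyteDB `MetaCache` API: the class name, `ResolveFn`, and method names here are assumptions.

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// Hypothetical meta-cache sketch: tablet locations are resolved lazily,
// cached, and dropped when a request observes that the tablet has moved,
// so server adds/removes on the producer are picked up on the next lookup.
class MetaCache {
 public:
  using ResolveFn = std::function<std::string(const std::string&)>;

  explicit MetaCache(ResolveFn resolve) : resolve_(std::move(resolve)) {}

  // Return the cached server for a tablet, resolving on a cache miss.
  std::string Lookup(const std::string& tablet) {
    auto it = tablet_to_server_.find(tablet);
    if (it != tablet_to_server_.end()) return it->second;
    std::string server = resolve_(tablet);
    tablet_to_server_[tablet] = server;
    return server;
  }

  // Drop a stale entry, e.g. after an RPC error indicating the tablet moved.
  void Invalidate(const std::string& tablet) {
    tablet_to_server_.erase(tablet);
  }

 private:
  ResolveFn resolve_;
  std::map<std::string, std::string> tablet_to_server_;
};
```

The key property is that stale locations heal themselves: an RPC failure triggers `Invalidate`, and the next `Lookup` re-resolves against current cluster state rather than the snapshot taken at replication setup.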

@nspiegelberg
Contributor

This is a low-priority feature until tablet splitting is added to YBase. The number of tablets is currently static, so the producer cluster would need all 3 replicas of a tablet to be on dead nodes, with no proxy redirection, before we would see connectivity issues.

nspiegelberg pushed a commit that referenced this issue Nov 7, 2019
…on CDC Consumer

Summary:
Depends on D7480
Previously, I created CDCReadRpc to handle CDC read requests using the TabletInvoker API.
TabletInvoker handles creating a <tablet, proxy> cache, replica failover and error handling, and
standard retry logic.  Using this on the CDC Consumer will solve a number of
outstanding tasks related to these issues.  Additionally, I added multi-universe support, since I
needed to differentiate between the various clients created anyway.

Test Plan:
1. Used pgbench in a 3 node setup to send 100k messages.
2. ybd --cxx-test twodc-test -n 20

Reviewers: rahuldesirazu, hector, neha

Reviewed By: neha

Subscribers: sergei, ybase, bogdan

Differential Revision: https://phabricator.dev.yugabyte.com/D7456
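The retry-with-failover behavior the commit summary attributes to TabletInvoker can be illustrated roughly as below. This is a simplified stand-in, not the real TabletInvoker API: the function name and round-based structure are assumptions, and real code would also apply backoff and distinguish retryable from fatal errors.

```cpp
#include <optional>
#include <string>
#include <vector>

// Sketch of retry-with-failover: try each replica in turn, and repeat the
// whole round a bounded number of times before giving up. `call` returns an
// engaged optional on success and std::nullopt on a retryable failure.
template <typename Call>
std::optional<std::string> InvokeWithFailover(
    const std::vector<std::string>& replicas, Call call, int max_rounds = 3) {
  for (int round = 0; round < max_rounds; ++round) {
    for (const auto& replica : replicas) {
      if (std::optional<std::string> result = call(replica)) {
        return result;  // success on this replica
      }
      // This replica failed; fall through to the next one.
    }
    // All replicas failed this round; real code would back off here.
  }
  return std::nullopt;
}
```

This also shows why nspiegelberg's point holds: with static tablets, the request only fails outright when every replica in the list is unreachable.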
@ndeodhar ndeodhar closed this as completed Jun 8, 2020