--yes can't skip confirmation of scale-out #1640

Closed
tabokie opened this issue Nov 25, 2021 · 0 comments · Fixed by #1645
Assignees
Labels
type/bug Categorizes issue as related to a bug.

Comments


tabokie commented Nov 25, 2021

Bug Report

Please answer these questions before submitting your issue. Thanks!

  1. What did you do?

tiup cluster scale-out tidb-test ./data/topology.yaml --yes

  2. What did you expect to see?

Proceed without asking for confirmation.

  3. What did you see instead?
[root@Copy-of-VM-EE-CentOS76-v1 pub-pastebin]# tiup cluster scale-out tidb-test ./data/topology.yaml --yes
Starting component `cluster`: /root/.tiup/components/cluster/v1.7.0/tiup-cluster scale-out tidb-test ./data/topology.yaml --yes
You have one or more of ["global", "monitored", "server_configs"] fields configured in
the scale out topology, but they will be ignored during the scaling out process.
If you want to use configs different from the existing cluster, cancel now and
set them in the specification fileds for each host.
Do you want to continue? [y/N]: (default=N)
  4. What version of TiUP are you using (tiup --version)?
1.7.0 tiup
Go Version: go1.17.3
Git Ref: v1.7.0
GitHash: ce8eb0a645cc3ead96a44d67b1ecd5034d112cf0
  5. Workaround:

yes | tiup ...
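The workaround relies on `yes(1)` writing an endless stream of `y` lines to stdout, which the pipe delivers to the prompt's stdin, so the first `read` consumes a `y` and the command proceeds. A minimal sketch of the mechanism, using a hypothetical `confirm` function in place of the real tiup prompt (this is not TiUP code):

```shell
#!/bin/sh
# Hypothetical stand-in for tiup's confirmation prompt (assumption, not
# TiUP's implementation): reads one answer from stdin, succeeds on y/Y.
confirm() {
  printf 'Do you want to continue? [y/N]: (default=N) '
  read -r answer
  case "$answer" in
    y|Y) return 0 ;;
    *)   return 1 ;;
  esac
}

# `yes` emits "y" lines forever; the pipe feeds the first one to the prompt,
# so it is auto-confirmed without manual input. `yes` exits on SIGPIPE once
# the reader is gone.
yes | confirm && echo "confirmed"
```

The same pipe works around any interactive y/N prompt; it is only needed here because the `--yes` flag is ignored by the scale-out warning.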

@tabokie tabokie added the type/bug Categorizes issue as related to a bug. label Nov 25, 2021
@srstack srstack self-assigned this Dec 1, 2021