upgrade from v5.2.4 to v6.0.0 fail, tiflash upgrade fail #4705

Closed
seiya-annie opened this issue Apr 19, 2022 · 10 comments
Labels: affects-5.2, severity/critical, type/bug (The issue is confirmed as a bug.)

Comments

seiya-annie commented Apr 19, 2022

Bug Report

Please answer these questions before submitting your issue. Thanks!

1. Minimal reproduce step (Required)

Install a TiDB cluster at v5.2.4 using the default config.
Upgrade it to v6.0.0.
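
For reference, a rough sketch of these steps as tiup commands; the cluster name and topology file below are placeholders, not taken from this report:

```shell
# Hypothetical reproduction sketch; <cluster-name> and topology.yaml are placeholders.
tiup cluster deploy <cluster-name> v5.2.4 topology.yaml   # deploy with the default config
tiup cluster start <cluster-name>                         # bring the v5.2.4 cluster up
tiup cluster upgrade <cluster-name> v6.0.0                # the upgrade that fails in this report
```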

2. What did you expect to see? (Required)

The upgrade succeeds.

3. What did you see instead (Required)

The upgrade fails with:

2022.04.19 17:34:36.789966 [ 1 ] Application: The configuration "path" is deprecated. Check [storage] section for new style.
[2022/04/19 17:37:36.138 +08:00] [WARN] [StorageConfigParser.cpp:229] ["Application:The configuration "path" is deprecated. Check [storage] section for new style."] [thread_id=1]

4. What is your TiFlash version? (Required)

[2022/04/19 17:34:36.564 +08:00] [INFO] [] ["TiFlash build info: TiFlash\nRelease Version: v5.2.4\nEdition: Community\nGit Commit Hash: 493855d\nGit Branch: heads/refs/tags/v5.2.4\nUTC Build Time: 2022-04-18 03:16:43\nProfile: RELWITHDEBINFO\n"] [thread_id=1]

seiya-annie added the type/bug label on Apr 19, 2022
fzhedu (Contributor) commented Apr 19, 2022

The warning can be ignored. Did you use tiup to upgrade the cluster? Use tiup cluster display to see the node status.
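
A minimal sketch of that check; the cluster name is a placeholder:

```shell
# Show the status of every node (TiFlash included) after the upgrade attempt.
tiup cluster display <cluster-name>
```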

seiya-annie (Author) commented

[Two screenshots attached, captured 2022-04-19 at 6:06:19 PM and 6:06:09 PM]

seiya-annie (Author) commented

tiup reported that the upgrade failed and exited.

seiya-annie (Author) commented

Changed the wait timeout to 300s; the upgrade still fails:

Upgrading component tiflash
Restarting instance 172.16.5.101:9000

Error: failed to restart: 172.16.5.101 tiflash-9000.service, please check the instance's log(/home/tidb/deploy/tiflash-9000/log) for more detail.: timed out waiting for tiflash 172.16.5.101:9000 to be ready after 300s: Get "http://172.16.5.101:20292/tiflash/store-status": dial tcp 172.16.5.101:20292: connect: connection refused
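
For reference, a sketch of the retry with a longer readiness timeout, assuming the "wait timeout" mentioned above maps to tiup's --wait-timeout flag (in seconds); the cluster name is a placeholder:

```shell
# Retry the upgrade, giving each instance up to 300 seconds to become ready.
# Note the timeout is not the root cause here: tiup polls the TiFlash status port
# (20292 in the error above) and the connection is refused because the process never starts.
tiup cluster upgrade <cluster-name> v6.0.0 --wait-timeout 300
```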

seiya-annie changed the title from "v5.2.4 tiflash upgrade fail for "The configuration "path" is deprecated."" to "upgrade from v5.2.4 to v6.0.0 fail, tiflash upgrade fail" on Apr 19, 2022
seiya-annie (Author) commented

Attached: tiflash_tikv.log

seiya-annie (Author) commented Apr 19, 2022

Found a FATAL error in tiflash_tikv.log:

[2022/04/19 18:27:21.108 +08:00] [FATAL] [lib.rs:463] ["failed to load_latest_options \"Invalid argument: Unable to parse the specified CF option disable_write_stall\""] [backtrace="stack backtrace:\n   0: tikv_util::set_panic_hook::{{closure}}\n   1: std::panicking::rust_panic_with_hook\n             at library/std/src/panicking.rs:595\n   2: std::panicking::begin_panic_handler::{{closure}}\n             at library/std/src/panicking.rs:497\n   3: std::sys_common::backtrace::__rust_end_short_backtrace\n             at library/std/src/sys_common/backtrace.rs:141\n   4: rust_begin_unwind\n             at library/std/src/panicking.rs:493\n   5: std::panicking::begin_panic_fmt\n             at library/std/src/panicking.rs:435\n   6: engine_rocks::raw_util::new_engine_opt::{{closure}}\n   7: engine_rocks::raw_util::new_engine_opt\n   8: server::server::run_tikv\n   9: server::proxy::run_proxy\n  10: _ZN2DB20RaftStoreProxyRunner20runRaftStoreProxyFFIEPv\n             at /home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tics/dbms/src/Server/Server.cpp:461\n  11: start_thread\n  12: clone\n"] [location=components/engine_rocks/src/raw_util.rs:122] [thread_name=<unnamed>]

fzhedu (Contributor) commented Apr 19, 2022

[2022/04/19 18:27:21.108 +08:00] [FATAL] [lib.rs:463] ["failed to load_latest_options \"Invalid argument: Unable to parse the specified CF option disable_write_stall\""] [backtrace="stack backtrace:\n   0: tikv_util::set_panic_hook::{{closure}}\n   1: std::panicking::rust_panic_with_hook\n             at library/std/src/panicking.rs:595\n   2: std::panicking::begin_panic_handler::{{closure}}\n             at library/std/src/panicking.rs:497\n   3: std::sys_common::backtrace::__rust_end_short_backtrace\n             at library/std/src/sys_common/backtrace.rs:141\n   4: rust_begin_unwind\n             at library/std/src/panicking.rs:493\n   5: std::panicking::begin_panic_fmt\n             at library/std/src/panicking.rs:435\n   6: engine_rocks::raw_util::new_engine_opt::{{closure}}\n   7: engine_rocks::raw_util::new_engine_opt\n   8: server::server::run_tikv\n   9: server::proxy::run_proxy\n  10: _ZN2DB20RaftStoreProxyRunner20runRaftStoreProxyFFIEPv\n             at /home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tics/dbms/src/Server/Server.cpp:461\n  11: start_thread\n  12: clone\n"] [location=components/engine_rocks/src/raw_util.rs:122] [thread_name=<unnamed>]

According to this fatal message, please remove 'disable_write_stall' from the config.
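
A sketch of how one might locate and remove the offending option; the deploy path and node address come from the error messages earlier in this thread, the cluster name is a placeholder, and whether the option appears at all depends on the deployment:

```shell
# On the TiFlash host, look for the unparsable RocksDB CF option in the generated configs
# (the deploy path is taken from the earlier error; adjust it to your deployment).
grep -rn "disable_write_stall" /home/tidb/deploy/tiflash-9000/conf/

# If it is injected through the cluster topology, remove it there and restart the instance.
tiup cluster edit-config <cluster-name>
tiup cluster restart <cluster-name> -N 172.16.5.101:9000
```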

seiya-annie (Author) commented

No such config is set in the topology file.

zanmato1984 (Contributor) commented
PR merged. No other version affected. Can we close this one? @seiya-annie

seiya-annie (Author) commented

> PR merged. No other version affected. Can we close this one? @seiya-annie

OK, I closed it.
