Add raft_large weights fine-tuned on Kitti #5081
Conversation
LGTM, thanks!
Reviewed By: fmassa
Differential Revision: D33185012
fbshipit-source-id: 0cde871c6054416b86634865ec758034ec51519e
towards #4644
This PR adds pre-trained weights for raft-large, fine-tuned on Kitti.
I'm publishing both our weights and the original weights. The Kitti authors kindly allowed us to submit our code for evaluation on Kitti-test.
Our f1-epe on Kitti-test is 5.19 whereas the original is 5.10. We're a tiny bit higher (i.e. worse), but still significantly better than the other baselines compared against in the paper (the next best is 6.10).
There are submission restrictions, so to keep consistency with the C_T and C_T_SKHT weights, I just submitted the model that we already used in both (after Kitti-specific fine-tuning). In other words, these weights are literally the current C_T_SKHT_V weights with a final fine-tuning step on Kitti.

Evaluated on kitti train:
kitti test submission:
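For context on the f1/epe numbers quoted above: epe is the mean end-point error, and f1 is the Kitti outlier rate. The evaluation tables themselves are not reproduced here; the following is only a minimal sketch of the standard Kitti definitions (the function name is mine, not from this PR):

```python
import torch

def kitti_epe_f1(flow_pred, flow_gt):
    """End-point error (EPE) and Kitti F1 outlier rate for dense optical flow.

    Both inputs are (2, H, W) flow fields. Under the usual Kitti convention,
    a pixel is an outlier if its EPE exceeds 3 px AND 5% of the ground-truth
    flow magnitude.
    """
    # Per-pixel Euclidean distance between predicted and ground-truth flow.
    epe = ((flow_pred - flow_gt) ** 2).sum(dim=0).sqrt()
    # Ground-truth flow magnitude, used for the relative-error threshold.
    mag = (flow_gt ** 2).sum(dim=0).sqrt()
    # Writing the 5% test as epe > 0.05 * mag avoids dividing by zero magnitude.
    outlier = (epe > 3.0) & (epe > 0.05 * mag)
    return epe.mean().item(), outlier.float().mean().item() * 100.0
```

A perfect prediction gives (0.0, 0.0); shifting every pixel's flow by 4 px against a large-magnitude ground truth gives an EPE of 4.0 and an F1 of 100%.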
cc @datumbox