
Allow retracking #867

Closed
getzze opened this issue Jul 27, 2022 · 3 comments
Assignees
Labels
bug Something isn't working

Comments

@getzze
Contributor

getzze commented Jul 27, 2022

Hi!
There was an issue allowing for retracking predicted instances without having to infer the instances again: #260

But it is no longer working with SLEAP v1.2.4.

If the Inference Pipeline Type in the GUI is set to None, and the Tracker is set to Flow or Simple, it returns an error saying it cannot run inference because no model was specified.

  • Versions:
    SLEAP: 1.2.4
    TensorFlow: 2.6.3
    Numpy: 1.19.5
    Python: 3.7.12
    OS: Windows-10-10.0.19041-SP0

  • SLEAP installed with Conda from package

@getzze getzze added the bug Something isn't working label Jul 27, 2022
@talmo
Collaborator

talmo commented Jul 27, 2022

Thanks for the report @getzze! We'll look into this and try to get a fix up ASAP.

In the meantime, you might be able to use this notebook as a workaround to do retracking without inference: https://sleap.ai/notebooks/Post_inference_tracking.html

Cheers,

Talmo

@roomrys
Collaborator

roomrys commented Aug 4, 2022

Problem Analysis

Initially, on Nov 20, 2019 (Add cli for running tracking by itself), we added a retrack() function in tracking.py that only runs under if __name__ == '__main__' and cannot be reached through any SLEAP command. Although a retracking feature was added on Jan 22, 2020 via Add tracking-only ui to sleap-track, that commit was later removed, seemingly by accident, on Mar 19, 2020 via Revert to 28a0031.

Proposal

The code base has changed considerably since Add tracking-only ui to sleap-track, so we need a custom solution rather than a straight copy-paste: we can mesh the logic from retrack() into the sleap-track command.
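As a rough sketch of that dispatch (all names here are illustrative stand-ins, not the real SLEAP API): when no models are supplied but a tracker is requested, sleap-track would skip predictor construction and re-track the existing predictions instead.

```python
# Hedged sketch of the proposed sleap-track dispatch. Function and
# argument names are hypothetical stand-ins for the real logic in
# sleap/nn/inference.py.

def run_cli(model_paths, tracker_name, data_path):
    """Decide between full inference and tracking-only retracking."""
    if model_paths:
        # Existing path: build a predictor (with an optional tracker
        # attached via load_model) and run inference.
        return f"inference on {data_path} with models {model_paths}"
    if tracker_name and tracker_name != "none":
        # Proposed path: no models given, so load the existing
        # predictions and only re-run the tracker over them.
        return f"retrack {data_path} with {tracker_name} tracker"
    raise ValueError("Nothing to do: specify model path(s) and/or a tracker.")
```

Under this shape, `run_cli([], "flow", "predictions.slp")` selects the retracking branch, which is exactly the case that currently errors out.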

Relevant Code

  1. Make the predictor from CLI args

    sleap/sleap/nn/inference.py

    Lines 4287 to 4288 in 44e4661

    # Setup models.
    predictor = _make_predictor_from_cli(args)

  2. Create the predictor from the model

    sleap/sleap/nn/inference.py

    Lines 4209 to 4215 in 44e4661

    predictor = load_model(
        args.models,
        peak_threshold=peak_threshold,
        batch_size=batch_size,
        refinement="integral",
        progress_reporting=args.verbosity,
    )

  3. The Predictor instance gets created inside load_model

    sleap/sleap/nn/inference.py

    Lines 3901 to 3906 in 44e4661

    predictor = Predictor.from_model_paths(
        model_paths,
        peak_threshold=peak_threshold,
        integral_refinement=refinement == "integral",
        batch_size=batch_size,
    )

  4. The Tracker is added to the Predictor inside load_model

    sleap/sleap/nn/inference.py

    Lines 3908 to 3914 in 44e4661

    if tracker is not None:
        predictor.tracker = Tracker.make_tracker_by_name(
            tracker=tracker,
            track_window=tracker_window,
            post_connect_single_breaks=True,
            clean_instance_count=tracker_max_instances,
        )

  5. Run inference

    sleap/sleap/nn/inference.py

    Lines 4294 to 4295 in 44e4661

    # Run inference!
    labels_pr = predictor.predict(provider)

  6. Generate predictions

    sleap/sleap/nn/inference.py

    Lines 431 to 437 in 44e4661

    generator = self._predict_generator(data)
    if make_labels:
        # Create SLEAP data structures while consuming results.
        return sleap.Labels(
            self._make_labeled_frames_from_generator(generator, data)
        )

6a. Process batch

sleap/sleap/nn/inference.py

Lines 402 to 403 in 44e4661

for ex in self.pipeline.make_dataset():
    yield process_batch(ex)

6b. Run inference on the batch inside process_batch (note: self.inference_model is created through the abstract method Predictor._initialize_inference_model)

sleap/sleap/nn/inference.py

Lines 311 to 312 in 44e4661

# Run inference on current batch.
preds = self.inference_model.predict_on_batch(ex, numpy=True)

6c. Call the superclass (tf.keras.Model) to predict on a single batch inside predict_on_batch
outs = super().predict_on_batch(data, **kwargs)

  7. Make labeled frames from generated predictions

    sleap/sleap/nn/inference.py

    Lines 433 to 437 in 44e4661

    if make_labels:
        # Create SLEAP data structures while consuming results.
        return sleap.Labels(
            self._make_labeled_frames_from_generator(generator, data)
        )

7a. If the predictor is either a TopDownPredictor or a BottomUpPredictor, then call the tracker inside Predictor._make_labeled_frames_from_generator...

sleap/sleap/nn/inference.py

Lines 2164 to 2168 in 44e4661

if self.tracker:
    # Set tracks for predicted instances in this frame.
    predicted_instances = self.tracker.track(
        untracked_instances=predicted_instances, img=image, t=frame_ind
    )

sleap/sleap/nn/inference.py

Lines 2699 to 2703 in 44e4661

if self.tracker:
    # Set tracks for predicted instances in this frame.
    predicted_instances = self.tracker.track(
        untracked_instances=predicted_instances, img=image, t=frame_ind
    )

7b. ... and do some post-processing track cleaning

sleap/sleap/nn/inference.py

Lines 2178 to 2179 in 44e4661

if self.tracker:
    self.tracker.final_pass(predicted_frames)

sleap/sleap/nn/inference.py

Lines 2713 to 2714 in 44e4661

if self.tracker:
    self.tracker.final_pass(predicted_frames)
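Putting the walkthrough above together, the predict-then-track loop has roughly the following shape. This is a self-contained toy sketch, not the SLEAP implementation: the tracker and batch processing are stand-ins, meant only to show where track() (step 7a) and final_pass() (step 7b) sit relative to the prediction generator (step 6a).

```python
def predict_generator(batches, process_batch):
    # Step 6a: stream batches through the (stubbed) inference model.
    for ex in batches:
        yield process_batch(ex)


class CountingTracker:
    """Toy stand-in for sleap.nn.tracking.Tracker: assigns each
    instance a fresh running track id."""

    def __init__(self):
        self.next_id = 0
        self.finalized = False

    def track(self, untracked_instances, t=0):
        # Step 7a: pair each instance with a track identity.
        tracked = []
        for inst in untracked_instances:
            tracked.append((inst, self.next_id))
            self.next_id += 1
        return tracked

    def final_pass(self, frames):
        # Step 7b: post-processing cleanup over all frames.
        self.finalized = True


def make_labeled_frames(generator, tracker):
    """Consume the prediction generator, tracking frame by frame."""
    predicted_frames = []
    for frame_ind, predicted_instances in enumerate(generator):
        if tracker:
            predicted_instances = tracker.track(
                untracked_instances=predicted_instances, t=frame_ind
            )
        predicted_frames.append(predicted_instances)
    if tracker:
        tracker.final_pass(predicted_frames)
    return predicted_frames
```

The key point for this issue: nothing in make_labeled_frames requires a trained model, only a generator of per-frame instances, which is why retracking existing predictions without re-running inference should be possible.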

@roomrys roomrys self-assigned this Aug 8, 2022
@roomrys roomrys added the fixed in future release Fix or feature is merged into develop and will be available in future release. label Aug 9, 2022
@roomrys
Collaborator

roomrys commented Sep 12, 2022

This issue has been resolved in the new release: install SLEAP v1.2.7.

@roomrys roomrys closed this as completed Sep 12, 2022
@roomrys roomrys removed the fixed in future release Fix or feature is merged into develop and will be available in future release. label Sep 12, 2022