
Handle duplicate slices #122

Open
Zhack47 opened this issue Oct 7, 2024 · 0 comments

Zhack47 (Contributor) commented Oct 7, 2024

While using the rt-Utils library for RTStruct conversion, I noticed that if a slice is present multiple times under the DICOM path, it is loaded that many times, producing a volume N times larger along the z-axis than it should be (N being the number of times each slice is repeated).

I fixed this by keeping only files with a unique SOPInstanceUID in the load_dcm_images_from_path function:

import os
from typing import List

from pydicom import dcmread
from pydicom.dataset import Dataset


def load_dcm_images_from_path(dicom_series_path: str) -> List[Dataset]:
    unique_sop_uids = set()  # New: SOPInstanceUIDs already loaded
    series_data = []
    for root, _, files in os.walk(dicom_series_path):
        for file in files:
            try:
                ds = dcmread(os.path.join(root, file))
                # Second check is new: skip any file whose SOPInstanceUID
                # has already been seen (i.e. a duplicate slice)
                if hasattr(ds, "pixel_array") and ds.SOPInstanceUID not in unique_sop_uids:
                    series_data.append(ds)
                    unique_sop_uids.add(ds.SOPInstanceUID)  # New
            except Exception:
                # Not a valid DICOM file
                continue

    return series_data
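The dedup logic can be exercised without real DICOM files or pydicom by stubbing minimal datasets; a small sketch, where `FakeDataset` and `dedup_by_sop_uid` are hypothetical names standing in for a pydicom Dataset and the uniqueness check inside load_dcm_images_from_path:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class FakeDataset:
    SOPInstanceUID: str  # stands in for the DICOM SOPInstanceUID attribute


def dedup_by_sop_uid(datasets: List[FakeDataset]) -> List[FakeDataset]:
    # Same idea as in load_dcm_images_from_path: keep the first dataset
    # seen for each SOPInstanceUID, drop any repeats.
    seen = set()
    unique = []
    for ds in datasets:
        if ds.SOPInstanceUID not in seen:
            seen.add(ds.SOPInstanceUID)
            unique.append(ds)
    return unique


# A series where slice "1.2.3" appears twice (e.g. the same file found
# under two sub-directories of the DICOM path):
slices = [FakeDataset("1.2.3"), FakeDataset("1.2.4"), FakeDataset("1.2.3")]
print(len(dedup_by_sop_uid(slices)))  # → 2
```

Using a set rather than a list for the seen UIDs keeps the membership check O(1) per file, which matters for large series.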

If this is considered useful, I can submit a pull request.

Regards,
Zhack
