Merge pull request #312 from jungmannlab/development
Development
rafalkowalewski1 authored Feb 7, 2023
2 parents 92317e2 + 89d75fe commit 7f4d285
Showing 33 changed files with 718 additions and 348 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 0.5.6
current_version = 0.5.7
commit = True
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\-(?P<release>[a-z]+)(?P<build>\d+))?
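
The ``parse`` pattern above is a regular expression with named groups; it can be checked quickly with Python's ``re`` module, which uses the same named-group syntax:

```python
import re

pattern = (
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"(\-(?P<release>[a-z]+)(?P<build>\d+))?"
)

# plain release version
m = re.fullmatch(pattern, "0.5.7")
print(m.group("major"), m.group("minor"), m.group("patch"))  # 0 5 7

# optional pre-release suffix
m = re.fullmatch(pattern, "0.5.7-rc1")
print(m.group("release"), m.group("build"))  # rc 1
```
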
8 changes: 7 additions & 1 deletion .gitignore
@@ -25,6 +25,12 @@ wheels/
.installed.cfg
*.egg

# Plugins
picasso/gui/plugins/

# Camera configuration
picasso/config.yaml

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
@@ -69,7 +75,7 @@ docs/_build/
target/

# Mac Desktop Service Store
.DS_Store
**/.DS_Store

# Jupyter Notebook
.ipynb_checkpoints
15 changes: 14 additions & 1 deletion changelog.rst
@@ -1,13 +1,26 @@
Changelog
=========

Last change: 01-DEC-2022 MTS
Last change: 07-FEB-2023 MTS

0.5.7
-----
- Updated installation instructions
- (H)DBSCAN available from cmd (bug fix)
- Rendering group information (e.g., clustered data) is faster
- Test Clusterer window (Render) has multiple updates, e.g., different projections and cluster centers display
- Cluster centers contain info about the standard deviation in x, y and z
- If localization precision in the z-axis is provided, it will be rendered when using ``Individual localization precision`` and ``Individual localization precision (iso)``. **NOTE:** the column must be named ``lpz`` and have the same units as ``lpx`` and ``lpy``.
- Number of CPU cores used in multiprocessing is limited to 60
- Updated 3D rendering and clustering documentation
- Bug fixes
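
For the new ``lpz`` support, a z precision column can be appended to an existing localizations table along these lines. This is a sketch only: the 2x factor is a hypothetical z precision estimate, and the key requirements from the changelog are the exact column name ``lpz`` and the same units as ``lpx``/``lpy``:

```python
import numpy as np
from numpy.lib.recfunctions import append_fields

# minimal stand-in for a localizations table with lateral precisions
locs = np.rec.array(
    [(0, 1.0, 2.0, 0.03, 0.04)],
    dtype=[("frame", "u4"), ("x", "f4"), ("y", "f4"),
           ("lpx", "f4"), ("lpy", "f4")],
)
# hypothetical z precision estimate; the column name must be exactly "lpz"
lpz = 2.0 * np.hypot(locs.lpx, locs.lpy)
locs = append_fields(locs, "lpz", lpz, usemask=False, asrecarray=True)
print(locs.lpz)  # about [0.1]
```
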

0.5.5 - 0.5.6
-------------
- Cluster info is saved in ``_cluster_centers.hdf5`` files which are created when ``Save cluster centers`` box is ticked
- Cluster centers contain info about group, mean frame (saved as ``frame``), standard deviation frame, area/volume and convex hull
- ``gist_rainbow`` is used for rendering properties
- NeNA can be calculated many times
- Bug fixes

0.5.1 - 0.5.4
4 changes: 2 additions & 2 deletions distribution/picasso.iss
@@ -2,10 +2,10 @@
AppName=Picasso
AppPublisher=Jungmann Lab, Max Planck Institute of Biochemistry

AppVersion=0.5.6
AppVersion=0.5.7
DefaultDirName={pf}\Picasso
DefaultGroupName=Picasso
OutputBaseFilename="Picasso-Windows-64bit-0.5.6"
OutputBaseFilename="Picasso-Windows-64bit-0.5.7"
ArchitecturesAllowed=x64
ArchitecturesInstallIn64BitMode=x64

6 changes: 6 additions & 0 deletions docs/cmd.rst
@@ -89,6 +89,12 @@ hdbscan
-------
Cluster localizations with the hdbscan clustering algorithm.

smlm_cluster
------------
Cluster localizations with the custom SMLM clustering algorithm.

The algorithm identifies the localizations with the most neighbors within a specified radius and builds clusters around these "local maxima".
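
The "local maxima" idea can be sketched in a few lines of numpy. This is a simplified 2D illustration with a brute-force neighbor search, not Picasso's actual implementation (which also merges overlapping clusters and supports 3D):

```python
import numpy as np

def smlm_cluster_sketch(xy, radius, min_locs):
    """Toy SMLM clusterer: points with the most neighbors within
    `radius` seed clusters; unassigned points in range join them."""
    n = len(xy)
    # brute-force pairwise distance matrix (fine for a sketch)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    n_neighbors = (d <= radius).sum(axis=1)
    labels = np.full(n, -1)
    next_label = 0
    # visit points from most to least neighbors ("local maxima" first)
    for i in np.argsort(-n_neighbors):
        if labels[i] != -1:
            continue
        members = np.flatnonzero((d[i] <= radius) & (labels == -1))
        if len(members) >= min_locs:
            labels[members] = next_label
            next_label += 1
    return labels

# two well-separated blobs of 20 localizations each -> two clusters
rng = np.random.default_rng(0)
xy = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
labels = smlm_cluster_sketch(xy, radius=0.5, min_locs=10)
print(len(set(labels.tolist()) - {-1}))
```
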

dark
----
Compute the dark time for grouped localizations.
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -26,7 +26,7 @@
# The short X.Y version
version = ""
# The full version, including alpha/beta/rc tags
release = "0.5.6"
release = "0.5.7"

# -- General configuration ---------------------------------------------------

15 changes: 9 additions & 6 deletions docs/render.rst
@@ -50,6 +50,8 @@ The 3D rotation window allows the user to render 3D localization data. To use it

The user may perform multiple actions in the rotation window, including: saving rotated localizations, building animations (.mp4 format), rotating by a specified angle, etc.

Rotation around the z-axis is available by holding Ctrl/Command. The rotation axis can be frozen by pressing x, y or z for the corresponding axis (to freeze the z-axis, Ctrl/Command must be held as well).

There are several things to keep in mind when using the rotation window. Firstly, using individual localization precision is very slow and is not recommended as a default blur method. Also, the size of the rotation window can be altered, however, if it becomes too large, rendering may start to lag.

Dialogs
@@ -373,7 +375,10 @@ Combines all localizations in each pick to one.

Apply expressions to localizations
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This tool allows you to apply expressions to localizations.
This tool allows you to apply expressions to localizations, for example:

- ``x +=1`` will shift all localizations by one to the right
- ``x +=1;y+=1`` will shift all localizations by one to the right and one up.
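
The effect of such an expression can be reproduced on a plain numpy structured array (a sketch of the effect only, not of Picasso's expression parser):

```python
import numpy as np

# minimal stand-in for a localizations table
locs = np.zeros(3, dtype=[("frame", "u4"), ("x", "f4"), ("y", "f4")])
locs["x"] = [1.0, 2.0, 3.0]
locs["y"] = [5.0, 5.0, 5.0]

# equivalent of entering "x +=1;y+=1" in the dialog
locs["x"] += 1
locs["y"] += 1

print(locs["x"])  # [2. 3. 4.]
print(locs["y"])  # [6. 6. 6.]
```
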

DBSCAN
^^^^^^
@@ -385,6 +390,9 @@ Cluster localizations with the hdbscan clustering algorithm.

SMLM clusterer
^^^^^^^^^^^^^^
Cluster localizations with a custom algorithm designed for SMLM. In short, localizations with the maximum number of neighboring localizations within a user-defined radius are chosen as cluster centers; all localizations within the given radius around a center belong to one cluster. If two or more such clusters overlap, they are combined.

*NOTE:* it is highly recommended to remove fiducial markers before clustering, as they are of no interest here and substantially increase clustering time. To do so, pick the markers and remove them using ``Tools > Remove localizations in picks``.

Test clusterer
^^^^^^^^^^^^^^
@@ -394,11 +402,6 @@ Nearest Neighbor Analysis
^^^^^^^^^^^^^^^^^^^^^^^^^
Calculates distances to the ``k``-th nearest neighbors between two channels (can be the same channel). ``k`` is defined by the user. The distances are stored in nm as a .csv file.
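
The ``k``-th nearest neighbor distance can be sketched with a brute-force numpy computation (illustration only; note that when both channels are the same, the zero self-distance would count as the first neighbor):

```python
import numpy as np

def kth_nn_distances(a, b, k):
    """Distance from each point in `a` to its k-th nearest neighbor in `b`."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    d.sort(axis=1)          # sort each row: nearest neighbor first
    return d[:, k - 1]      # k-th smallest distance per point

a = np.array([[0.0, 0.0], [10.0, 0.0]])
b = np.array([[1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
print(kth_nn_distances(a, b, k=2))  # [2. 8.]
```
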

Examples
++++++++
- ``x +=1`` will shift all localizations by one to the right
- ``x +=1;y+=1`` will shift all localizations by one to the right and one up.

Notes
+++++
Using two variables in one statement is not supported (e.g., ``x = y``). To filter localizations, use Picasso Filter.
Binary file removed picasso/.DS_Store
Binary file not shown.
128 changes: 97 additions & 31 deletions picasso/__main__.py
@@ -445,30 +445,27 @@ def _dbscan(files, radius, min_density):

paths = glob.glob(files)
if paths:
from . import io, postprocess
from h5py import File
from . import io, clusterer
pixelsize = int(input("Enter the camera pixelsize in nm: "))

for path in paths:
print("Loading {} ...".format(path))
locs, info = io.load_locs(path)
clusters, locs = postprocess.dbscan(locs, radius, min_density)
base, ext = os.path.splitext(path)
locs = clusterer.dbscan(locs, radius, min_density, pixelsize)
clusters = clusterer.find_cluster_centers(locs, pixelsize)
base, _ = os.path.splitext(path)
dbscan_info = {
"Generated by": "Picasso DBSCAN",
"Radius": radius,
"Minimum local density": min_density,
}
info.append(dbscan_info)
io.save_locs(base + "_dbscan.hdf5", locs, info)
with File(base + "_dbclusters.hdf5", "w") as clusters_file:
clusters_file.create_dataset("clusters", data=clusters)
io.save_locs(base + "_dbclusters.hdf5", clusters, info)
print(
"Clustering executed. Results are saved in: \n"
+ base
+ "_dbscan.hdf5"
+ "\n"
+ base
+ "_dbclusters.hdf5"
f"{base}_dbscan.hdf5\n"
f"{base}_dbclusters.hdf5"
)


@@ -477,20 +474,14 @@ def _hdbscan(files, min_cluster, min_samples):

paths = glob.glob(files)
if paths:
from . import io, postprocess
from h5py import File
from . import io, clusterer
pixelsize = int(input("Enter the camera pixelsize in nm: "))

for path in paths:
print("Loading {} ...".format(path))
locs, info = io.load_locs(path)
if hasattr(locs, "z"):
pixelsize = input("Camera pixelsize in nm: ")
pixelsize = float(pixelsize)
else:
pixelsize = None
locs = postprocess.hdbscan(
locs, min_cluster, min_samples, pixelsize
)
locs = clusterer.hdbscan(locs, min_cluster, min_samples, pixelsize)
clusters = clusterer.find_cluster_centers(locs, pixelsize)
base, ext = os.path.splitext(path)
hdbscan_info = {
"Generated by": "Picasso HDBSCAN",
@@ -499,15 +490,45 @@
}
info.append(hdbscan_info)
io.save_locs(base + "_hdbscan.hdf5", locs, info)
# with File(base + "_hdbclusters.hdf5", "w") as clusters_file:
# clusters_file.create_dataset("clusters", data=clusters)
io.save_locs(base + "_hdbclusters.hdf5", clusters, info)
print(
"Clustering executed. Results are saved in: \n"
+ base
+ "_hdbscan.hdf5"
+ "\n"
+ base
+ "_hdbclusters.hdf5"
f"{base}_hdbscan.hdf5\n"
f"{base}_hdbclusters.hdf5"
)

def _smlm_clusterer(files, radius, min_locs, basic_fa=False, radius_z=None):
import glob

paths = glob.glob(files)
if paths:
from . import io, clusterer
pixelsize = int(input("Enter the camera pixelsize in nm: "))
if radius_z is not None: # 3D
params = [radius, radius_z, min_locs, 0, basic_fa, 0]
else: # 2D
params = [radius, min_locs, 0, basic_fa, 0]

for path in paths:
print("Loading {} ...".format(path))
locs, info = io.load_locs(path)
locs = clusterer.cluster(locs, params, pixelsize)
clusters = clusterer.find_cluster_centers(locs, pixelsize)
base, ext = os.path.splitext(path)
smlm_cluster_info = {
"Generated by": "Picasso SMLM clusterer",
"Radius_xy": radius,
"Radius_z": radius_z,
"Min locs": min_locs,
"Basic frame analysis": basic_fa,
}
info.append(smlm_cluster_info)
io.save_locs(base + "_clusters.hdf5", locs, info)
io.save_locs(base + "_cluster_centers.hdf5", clusters, info)
print(
"Clustering executed. Results are saved in: \n"
f"{base}_clusters.hdf5\n"
f"{base}_cluster_centers.hdf5"
)


@@ -545,8 +566,8 @@ def _dark(files):
locs, info = io.load_locs(path)
locs = postprocess.compute_dark_times(locs)
base, ext = os.path.splitext(path)
dbscan_info = {"Generated by": "Picasso Dark"}
info.append(dbscan_info)
d_info = {"Generated by": "Picasso Dark"}
info.append(d_info)
io.save_locs(base + "_dark.hdf5", locs, info)


@@ -1242,7 +1263,7 @@ def main():
dbscan_parser.add_argument(
"radius",
type=float,
help=("maximal distance between two localizations" " to be considered local"),
help=("maximal distance (camera pixels) between two localizations" " to be considered local"),
)
dbscan_parser.add_argument(
"density",
@@ -1273,6 +1294,43 @@
help=("the higher the more points are considered noise"),
)

# SMLM clusterer
smlm_cluster_parser = subparsers.add_parser(
"smlm_cluster",
help="cluster localizations with the custom SMLM clustering algorithm",
)
smlm_cluster_parser.add_argument(
"files",
help=(
"one or multiple hdf5 localization files"
" specified by a unix style path pattern"
),
)
smlm_cluster_parser.add_argument(
"radius",
type=float,
help=("clustering radius (in camera pixels)"),
)
smlm_cluster_parser.add_argument(
"min_locs",
type=int,
help=("minimum number of localizations in a cluster"),
)
smlm_cluster_parser.add_argument(
"basic_fa",
type=bool,
help=("whether or not to perform basic frame analysis (sticking event removal)"),
default=False,
)
smlm_cluster_parser.add_argument(
"radius_z",
type=float,
help=(
"clustering radius in the axial direction (must be set for 3D data)"
),
default=None,
)

# Dark time
dark_parser = subparsers.add_parser(
"dark", help="compute the dark time for grouped localizations"
@@ -1619,6 +1677,14 @@ def main():
_dbscan(args.files, args.radius, args.density)
elif args.command == "hdbscan":
_hdbscan(args.files, args.min_cluster, args.min_samples)
elif args.command == "smlm_cluster":
_smlm_clusterer(
args.files,
args.radius,
args.min_locs,
args.basic_fa,
args.radius_z,
)
elif args.command == "nneighbor":
_nneighbor(args.files)
elif args.command == "dark":
2 changes: 1 addition & 1 deletion picasso/__version__.py
@@ -1 +1 @@
VERSION_NO = "0.5.6"
VERSION_NO = "0.5.7"
4 changes: 3 additions & 1 deletion picasso/avgroi.py
@@ -44,7 +44,9 @@ def fit_spots(spots):


def fit_spots_parallel(spots, asynch=False):
n_workers = max(1, int(0.75 * _multiprocessing.cpu_count()))
n_workers = min(
60, max(1, int(0.75 * _multiprocessing.cpu_count()))
) # Python crashes when using >64 cores
n_spots = len(spots)
n_tasks = 100 * n_workers
spots_per_task = [
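
The worker cap introduced above can be illustrated in isolation (a standalone restatement of the same formula):

```python
import multiprocessing

def n_workers(cpu_count=None, cap=60):
    """75% of available cores, at least 1, capped at `cap`
    (Python's multiprocessing can crash when using >64 cores)."""
    if cpu_count is None:
        cpu_count = multiprocessing.cpu_count()
    return min(cap, max(1, int(0.75 * cpu_count)))

print(n_workers(cpu_count=4))    # 3
print(n_workers(cpu_count=128))  # 60
```
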