running on other OS than Linux #122

Open
tischi opened this issue Jan 10, 2024 · 2 comments

tischi commented Jan 10, 2024

@constantinpape @martinschorb

I did not manage to run the conversion to OME-Zarr on my Mac (see the MoBIE Mattermost channel).

This raises a general question: on which operating systems is the code here supposed to run?

constantinpape (Contributor) commented

In principle everything should run on Mac and Windows, and there are unit tests for both that pass without issues:
https://github.com/mobie/mobie-utils-python/actions/runs/7186625802/job/19572434858

Maybe these don't cover all the cases for ome-zarr conversion?! To fix this I would need a minimal, self-contained, reproducible example where it fails.
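(For anyone who wants to provide one: a minimal example might look like the sketch below. This is only a sketch, not tested code — it assumes the `mobie.add_image` entry point described in the mobie-utils-python README, and parameter names may differ between versions.)

```python
# minimal_repro.py -- hypothetical sketch of a minimal, self-contained example.
# Assumes the mobie.add_image entry point from the mobie-utils-python README;
# parameter names may differ between versions.
import numpy as np
import tifffile
import mobie

if __name__ == "__main__":  # the guard matters on macOS/Windows, see below
    # Create a small random test volume so the example is self-contained.
    data = np.random.randint(0, 255, size=(8, 256, 256), dtype="uint8")
    tifffile.imwrite("test_volume.tif", data)

    # Convert it to OME-Zarr and add it to a fresh test project.
    mobie.add_image(
        input_path="test_volume.tif",
        input_key="",                        # empty key for tif input
        root="./test_project/data",
        dataset_name="dataset1",
        image_name="test_volume",
        resolution=(10.0, 10.0, 10.0),
        scale_factors=[[1, 2, 2], [1, 2, 2]],
        chunks=(8, 64, 64),
        unit="nanometer",
    )
```

If a script along these lines fails on macOS or Windows but runs on Linux, it should be a usable repro.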

eschalnajmi commented

Hi! I'm facing a very similar issue right now: the code works as a .py script on Linux, but not on macOS or Windows. However, it does work on Mac and Windows when run as an .ipynb file. When run as a .py file containing the exact same code, it seems to get stuck on the downscaling step. Could Jupyter notebook be adding a layer of virtualisation?

The exact error traceback:
```
najmie@CH7H1X6L5F mobie_python_project % /Users/najmie/miniconda3/envs/mobie3/bin/python /Users/najmie/Downloads/mobie_python_project/project_creation.py
Only one channel found in data/em_20nm_z_40_145.tif - no conversion needed :)
/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/parameter.py:296: UserWarning: Parameter "dtype" with value "None" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
DEBUG: Checking if DownscalingWorkflow(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, target=local, dependency=DummyTask, input_path=data/em_20nm_z_40_145.tif, input_key=, scale_factors=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], halos=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], dtype=None, int_to_uint=False, metadata_format=ome.zarr, metadata_dict={"resolution": [10.0, 10.0, 10.0], "unit": "nanometer", "setup_name": "em_20nm_z_40_145"}, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key_prefix=, force_copy=False, skip_existing_levels=False, scale_offset=0) is complete
/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/parameter.py:296: UserWarning: Parameter "scale_factor" with value "(1, 2, 2)" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
DEBUG: Checking if WriteDownscalingMetadata(tmp_folder=tmp_dataset1_em_20nm_z_40_145, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, scale_factors=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], dependency=DownscalingLocal, metadata_format=ome.zarr, metadata_dict={"resolution": [10.0, 10.0, 10.0], "unit": "nanometer", "setup_name": "em_20nm_z_40_145"}, output_key_prefix=, scale_offset=0, prefix=downscaling) is complete
INFO: Informed scheduler that task DownscalingWorkflow_tmp_dataset1_em__DummyTask_None_f01a3395a1 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s3, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s4, scale_factor=(1, 2, 2), scale_prefix=s4, halo=[1, 2, 2], effective_scale_factor=[1, 16, 16], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task WriteDownscalingMetadata_DownscalingLocal___resolution_____ome_zarr_784397d098 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s2, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s3, scale_factor=(1, 2, 2), scale_prefix=s3, halo=[1, 2, 2], effective_scale_factor=[1, 8, 8], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___d657c21b11 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s1, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s2, scale_factor=(1, 2, 2), scale_prefix=s2, halo=[1, 2, 2], effective_scale_factor=[1, 4, 4], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___d82adb898c has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s0, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s1, scale_factor=(1, 2, 2), scale_prefix=s1, halo=[1, 2, 2], effective_scale_factor=[1, 2, 2], dimension_separator=/, dependency=CopyVolumeLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___7fa9471af6 has status PENDING
DEBUG: Checking if CopyVolumeLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=data/em_20nm_z_40_145.tif, input_key=, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s0, prefix=initial_scale, dtype=None, int_to_uint=False, fit_to_roi=False, effective_scale_factor=[], dimension_separator=/, dependency=DummyTask) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__CopyVolumeLocal___452731ba01 has status PENDING
DEBUG: Checking if DummyTask() is complete
INFO: Informed scheduler that task CopyVolumeLocal_tmp_dataset1_em__DummyTask___07bedbbf06 has status PENDING
INFO: Informed scheduler that task DummyTask__99914b932b has status DONE
INFO: Done scheduling tasks
INFO: Running Worker with 1 processes
DEBUG: Asking scheduler for work...
DEBUG: Pending tasks: 7
INFO: [pid 11765] Worker Worker(salt=8065394207, workers=1, host=CH7H1X6L5F, username=najmie, pid=11765) running CopyVolumeLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=data/em_20nm_z_40_145.tif, input_key=, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s0, prefix=initial_scale, dtype=None, int_to_uint=False, fit_to_roi=False, effective_scale_factor=[], dimension_separator=/, dependency=DummyTask)
Only one channel found in data/em_20nm_z_40_145.tif - no conversion needed :)
/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/parameter.py:296: UserWarning: Parameter "dtype" with value "None" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
DEBUG: Checking if DownscalingWorkflow(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, target=local, dependency=DummyTask, input_path=data/em_20nm_z_40_145.tif, input_key=, scale_factors=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], halos=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], dtype=None, int_to_uint=False, metadata_format=ome.zarr, metadata_dict={"resolution": [10.0, 10.0, 10.0], "unit": "nanometer", "setup_name": "em_20nm_z_40_145"}, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key_prefix=, force_copy=False, skip_existing_levels=False, scale_offset=0) is complete
/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/parameter.py:296: UserWarning: Parameter "scale_factor" with value "(1, 2, 2)" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
DEBUG: Checking if WriteDownscalingMetadata(tmp_folder=tmp_dataset1_em_20nm_z_40_145, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, scale_factors=[[1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 2, 2]], dependency=DownscalingLocal, metadata_format=ome.zarr, metadata_dict={"resolution": [10.0, 10.0, 10.0], "unit": "nanometer", "setup_name": "em_20nm_z_40_145"}, output_key_prefix=, scale_offset=0, prefix=downscaling) is complete
INFO: Informed scheduler that task DownscalingWorkflow_tmp_dataset1_em__DummyTask_None_f01a3395a1 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s3, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s4, scale_factor=(1, 2, 2), scale_prefix=s4, halo=[1, 2, 2], effective_scale_factor=[1, 16, 16], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task WriteDownscalingMetadata_DownscalingLocal___resolution_____ome_zarr_784397d098 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s2, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s3, scale_factor=(1, 2, 2), scale_prefix=s3, halo=[1, 2, 2], effective_scale_factor=[1, 8, 8], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___d657c21b11 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s1, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s2, scale_factor=(1, 2, 2), scale_prefix=s2, halo=[1, 2, 2], effective_scale_factor=[1, 4, 4], dimension_separator=/, dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___d82adb898c has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s0, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s1, scale_factor=(1, 2, 2), scale_prefix=s1, halo=[1, 2, 2], effective_scale_factor=[1, 2, 2], dimension_separator=/, dependency=CopyVolumeLocal) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__DownscalingLocal___7fa9471af6 has status PENDING
DEBUG: Checking if CopyVolumeLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=data/em_20nm_z_40_145.tif, input_key=, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s0, prefix=initial_scale, dtype=None, int_to_uint=False, fit_to_roi=False, effective_scale_factor=[], dimension_separator=/, dependency=DummyTask) is complete
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__CopyVolumeLocal___452731ba01 has status PENDING
INFO: Informed scheduler that task CopyVolumeLocal_tmp_dataset1_em__DummyTask___07bedbbf06 has status DONE
INFO: Done scheduling tasks
INFO: Running Worker with 1 processes
DEBUG: Asking scheduler for work...
DEBUG: Pending tasks: 6
INFO: [pid 11783] Worker Worker(salt=2341089979, workers=1, host=CH7H1X6L5F, username=najmie, pid=11783) running DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s0, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s1, scale_factor=(1, 2, 2), scale_prefix=s1, halo=[1, 2, 2], effective_scale_factor=[1, 2, 2], dimension_separator=/, dependency=CopyVolumeLocal)
ERROR: [pid 11783] Worker Worker(salt=2341089979, workers=1, host=CH7H1X6L5F, username=najmie, pid=11783) failed DownscalingLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, input_key=s0, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s1, scale_factor=(1, 2, 2), scale_prefix=s1, halo=[1, 2, 2], effective_scale_factor=[1, 2, 2], dimension_separator=/, dependency=CopyVolumeLocal)
Traceback (most recent call last):
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/worker.py", line 210, in run
new_deps = self._run_get_new_deps()
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/worker.py", line 138, in _run_get_new_deps
task_gen = self.task.run()
^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 95, in run
raise e
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 81, in run
self.run_impl()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/downscaling/downscaling.py", line 184, in run_impl
self.submit_jobs(n_jobs, self.scale_prefix)
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 567, in submit_jobs
tasks = [pp.submit(self._submit, job_id, job_prefix) for job_id in range(n_jobs)]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 567, in
tasks = [pp.submit(self._submit, job_id, job_prefix) for job_id in range(n_jobs)]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/process.py", line 808, in submit
self._adjust_process_count()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/process.py", line 767, in _adjust_process_count
self._spawn_process()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/process.py", line 785, in _spawn_process
p.start()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/process.py", line 121, in start
self._popen = self._Popen(self)
^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/context.py", line 288, in _Popen
return Popen(process_obj)
^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 32, in init
super().init(process_obj)
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/popen_fork.py", line 19, in init
self._launch(process_obj)
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 42, in _launch
prep_data = spawn.get_preparation_data(process_obj._name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/spawn.py", line 164, in get_preparation_data
_check_not_importing_main()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/multiprocessing/spawn.py", line 140, in _check_not_importing_main
raise RuntimeError('''
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.

    To fix this issue, refer to the "Safe importing of main module"
    section in https://docs.python.org/3/library/multiprocessing.html

DEBUG: 1 running tasks, waiting for next task to finish
INFO: Informed scheduler that task DownscalingLocal_tmp_dataset1_em__CopyVolumeLocal___452731ba01 has status FAILED
DEBUG: Asking scheduler for work...
DEBUG: Done
DEBUG: There are no more tasks to run at this time
DEBUG: There are 6 pending tasks possibly being run by other workers
DEBUG: There are 6 pending tasks unique to this worker
DEBUG: There are 6 pending tasks last scheduled by this worker
INFO: Worker Worker(salt=2341089979, workers=1, host=CH7H1X6L5F, username=najmie, pid=11783) was stopped. Shutting down Keep-Alive thread
INFO:
===== Luigi Execution Summary =====

Scheduled 7 tasks of which:

  • 1 complete ones were encountered:
    • 1 CopyVolumeLocal(...)
  • 1 failed:
    • 1 DownscalingLocal(...)
  • 5 were left pending, among these:
    • 5 had failed dependencies:
      • 3 DownscalingLocal(...)
      • 1 DownscalingWorkflow(...)
      • 1 WriteDownscalingMetadata(...)

This progress looks :( because there were failed tasks

===== Luigi Execution Summary =====

Could not add em_20nm_z_40_145 to project :(
ERROR: [pid 11765] Worker Worker(salt=8065394207, workers=1, host=CH7H1X6L5F, username=najmie, pid=11765) failed CopyVolumeLocal(tmp_folder=tmp_dataset1_em_20nm_z_40_145, max_jobs=1, config_dir=tmp_dataset1_em_20nm_z_40_145/configs, input_path=data/em_20nm_z_40_145.tif, input_key=, output_path=test_project/data/dataset1/images/ome-zarr/em_20nm_z_40_145.ome.zarr, output_key=s0, prefix=initial_scale, dtype=None, int_to_uint=False, fit_to_roi=False, effective_scale_factor=[], dimension_separator=/, dependency=DummyTask)
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/process.py", line 261, in _process_worker
r = call_item.fn(*call_item.args, **call_item.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 547, in _submit
assert os.path.exists(script_path), script_path
AssertionError: tmp_dataset1_em_20nm_z_40_145/copy_volume.py
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 81, in run
self.run_impl()
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/copy_volume/copy_volume.py", line 163, in run_impl
self.submit_jobs(n_jobs, self.prefix)
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 568, in submit_jobs
[t.result() for t in tasks]
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 568, in
[t.result() for t in tasks]
^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/_base.py", line 456, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
AssertionError: tmp_dataset1_em_20nm_z_40_145/copy_volume.py

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/worker.py", line 210, in run
new_deps = self._run_get_new_deps()
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/luigi/worker.py", line 138, in _run_get_new_deps
task_gen = self.task.run()
^^^^^^^^^^^^^^^
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 92, in run
self._write_log("task failed in run_impl with %s" % msg)
File "/Users/najmie/miniconda3/envs/mobie3/lib/python3.11/site-packages/cluster_tools/cluster_tasks.py", line 281, in _write_log
with open(log_file, 'a') as f:
^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'tmp_dataset1_em_20nm_z_40_145/copy_volume_initial_scale.log'
DEBUG: 1 running tasks, waiting for next task to finish
INFO: Informed scheduler that task CopyVolumeLocal_tmp_dataset1_em__DummyTask___07bedbbf06 has status FAILED
DEBUG: Asking scheduler for work...
DEBUG: Done
DEBUG: There are no more tasks to run at this time
DEBUG: There are 7 pending tasks possibly being run by other workers
DEBUG: There are 7 pending tasks unique to this worker
DEBUG: There are 7 pending tasks last scheduled by this worker
INFO: Worker Worker(salt=8065394207, workers=1, host=CH7H1X6L5F, username=najmie, pid=11765) was stopped. Shutting down Keep-Alive thread
INFO:
===== Luigi Execution Summary =====

Scheduled 8 tasks of which:

  • 1 complete ones were encountered:
    • 1 DummyTask()
  • 1 failed:
    • 1 CopyVolumeLocal(...)
  • 6 were left pending, among these:
    • 6 had failed dependencies:
      • 4 DownscalingLocal(...)
      • 1 DownscalingWorkflow(...)
      • 1 WriteDownscalingMetadata(...)

This progress looks :( because there were failed tasks

===== Luigi Execution Summary =====

Could not add em_20nm_z_40_145 to project :(
```

The code being run is in this repo: https://github.com/eschalnajmi/load_mobie_project

If anyone has figured out an easy way to fix this, I would very much appreciate it :)
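For reference, the RuntimeError in the traceback above is CPython's standard spawn-guard message: on macOS and Windows the default multiprocessing start method is "spawn", which re-imports the main module in every child process, whereas Linux defaults to "fork". That would also explain why the notebook works — in Jupyter, the user code is not the `__main__` module that gets re-imported. Below is a minimal sketch of the usual fix, assuming project_creation.py currently launches the conversion at module level (only the file name is taken from the log above; the `main()` wrapper is an assumption):

```python
# project_creation.py -- sketch of the standard fix for the RuntimeError above.
# On macOS/Windows, multiprocessing "spawn" re-imports this module in every
# child process, so any code that (directly or indirectly) starts worker
# processes must live behind the __main__ guard.

def main():
    # ... the existing project-creation code goes here unchanged,
    # e.g. the mobie / OME-Zarr conversion calls ...
    pass

if __name__ == "__main__":
    main()
```

An alternative workaround is to call `multiprocessing.set_start_method("fork")` at the start of the guarded block, but fork is known to be fragile on macOS with some libraries, so the guard is the more portable fix. The later AssertionError (`tmp_dataset1_em_20nm_z_40_145/copy_volume.py` missing) and FileNotFoundError look like downstream symptoms of the same failed bootstrap, since the log shows the whole script being executed a second time by the spawned child.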
