globus is a Python script that interfaces with the APS Data Management system (e.g. Voyager) and a generic Globus server (e.g. Petrel). It reads beamline PVs to set up a Data Management experiment, creates directories on the data acquisition and analysis machines, manages users for the experiment, sends e-mails to users with information on how to retrieve their data from Voyager or Petrel, and manages automated data transfers (termed DAQs) from the analysis machine to Voyager. The notification e-mail can be sent to all users listed in the beamline schedule by using the --schedule option.
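For example, assuming the --schedule option applies to the globus email command, all users on the current beamtime could be notified with:

$ globus email --schedule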
year-month, pi_last_name and pi_email are read from the EPICS PVs defined in the 'epics' section of the globus config file. By default these PVs are served by TomoScan and can be automatically updated for the current user using dmagic tag.
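As an illustration only, the 'epics' section might look like the sketch below; the option names and PV names are hypothetical and must match the PVs your TomoScan IOC actually serves:

[epics]
# hypothetical option and PV names; adjust to your beamline
year_month_pv   = 2bm:TomoScan:ExperimentYearMonth
pi_last_name_pv = 2bm:TomoScan:UserLastName
pi_email_pv     = 2bm:TomoScan:UserEmail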
globus depends on the Globus SDK and paramiko. Both can be installed with conda:
$ conda install -c conda-forge globus-sdk
$ conda install -c anaconda paramiko
Install Python 3.x from Anaconda, then install globus:
$ git clone https://github.com/xray-imaging/globus.git
$ cd globus
$ python setup.py install
You will also need to have the APS Data Management system installed for your beamline; contact the SDM group for this installation. Once installed you can run globus in a terminal with:
$ source /home/dm_bm/etc/dm.setup.sh
$ globus -h
Alternatively you can install the Data Management API with conda:
$ conda install -c aps-anl-tag aps-dm-api
There are also several environment variables that must be set for the DM API to work properly. They can be found in the /home/dm_bm/etc/dm.conda.setup.sh script. Copy everything in this script, except the change to PATH, into your account's ~/.bashrc file.
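As a sketch only (the variable name below is a placeholder, not a real one), the copied lines take the form of plain export statements; use the exact lines found in your beamline's dm.conda.setup.sh:

# in ~/.bashrc: copy the real export lines from /home/dm_bm/etc/dm.conda.setup.sh,
# skipping the line that modifies PATH
export DM_EXAMPLE_VARIABLE=/value/taken/from/the/setup/script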
- Customize the e-mail sent to the users by editing the message
- For automatic retrieval of user information from the APS scheduling system, see dmagic tag (example below)
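For example, after dmagic is installed and configured for your beamline, the PVs above can be updated with the current user information via:

$ dmagic tag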
Once the DMagic medm screen is synchronized with the APS scheduling system and contains valid values, you can run globus as follows:
$ source /home/dm_bm/etc/dm.setup.sh
then:
$ globus -h
- globus set
- Creates a globus.conf default file
- globus init
- Initializes data management. If using a DM server:
  - Creates an experiment in the DM system
  - Adds users to this experiment
- If using a Globus server:
  - Creates or refreshes a Globus access token
  - Creates a directory on the Globus server
- globus dirs
- Checks for directories on the analysis and detector computers and creates them as needed
- globus email
- E-mails all users on an experiment with information on how to access their data
- globus start_daq
- Starts automated file upload from the analysis computer to a DM server
- globus stop_daq
- Stops automated file uploads for this experiment to a DM server
- globus add_user --edit-user-badge 123456
- Adds the user with the badge 123456 to a DM experiment
- globus list_users
- Lists the users (name and badge numbers) that are part of the DM experiment
- globus remove_user --edit-user-badge 123456
- Removes the user with badge 123456 from the DM experiment
The data collection and data analysis machines need to be configured in the local section of the config file. Directory creation requires ssh access to the data collection and data analysis machines; if you prefer not to enter a password each time, see SSH login without password.
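One common way to set this up, shown here as a sketch with placeholder user and host names, is to generate an ssh key pair and copy the public key to both machines:

$ ssh-keygen -t rsa
$ ssh-copy-id user@analysis_machine
$ ssh-copy-id user@data_collection_machine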
For DM server:
$ globus init
$ globus dirs
$ globus list_users
$ globus add_user --edit-user-badge 123456
$ globus remove_user --edit-user-badge 987654
$ globus email
For Globus server:
$ globus init
$ globus dirs
$ globus email