Document, test, and actively encourage use of csv.gz, feather and dta.gz formats
Our biggest files are dta files, so it's worth verifying that they work with the gzuse package and helping study authors move to them. This may involve manually gzipping some existing files on the server to avoid re-running jobs.

Likewise, we should evangelise .gz formats and help folks move to them wherever possible.
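A minimal sketch of the manual server-side gzipping step, assuming a simple "compress each output and delete the original" policy (the function name and directory layout are illustrative, not part of job-runner):

```python
import gzip
import shutil
from pathlib import Path


def gzip_dta_files(workspace: Path) -> list:
    """Compress each .dta file under `workspace` to .dta.gz and remove the
    original, so already-generated outputs don't need to be re-run."""
    compressed = []
    for dta in sorted(workspace.rglob("*.dta")):
        gz = dta.parent / (dta.name + ".gz")
        if gz.exists():
            continue  # already compressed on a previous pass
        with open(dta, "rb") as src, gzip.open(gz, "wb") as dst:
            shutil.copyfileobj(src, dst)
        # Only remove the original once the compressed copy is fully written.
        dta.unlink()
        compressed.append(gz)
    return compressed
```

Keeping the original until the `.gz` copy is complete means an interrupted run can simply be re-executed without data loss.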
This is a meta issue to track various ways we might avoid hitting Docker's disk limits, either by reducing the number and size of files or by increasing the limits.

Some of these are not job-runner specific, but job-runner seemed as good a central place as any to track them.
- Move to using Ubuntu: opensafely-core/backend-server#46
- Implement inputs: proposal: #319. This reduces the number of files copied into the volume.
- More efficient copying: `docker cp` is not fast, and it's error-prone: Improve performance of copying large numbers of files in/out of Docker volumes #169
- Document, test, and actively encourage use of csv.gz, feather and dta.gz formats (see above)
- Implement archiving workspaces: