Problem:
We can only access our storage indirectly via sshfs; that is the only place where we can run dds, and the storage available there is limited.
Running `dds data put` on the sshfs-mounted network storage means the data has to be read and written over the network several times before it is finally sent, which puts a lot of load on our network and our systems and slows things down.
We would need a considerable amount of storage on the edge server that mounts the HPC storage in order to do the packaging on local disk.
Proposed solution:
If a few functions could be imported and isolated into a script that prepares an archive for streaming to the cloud storage, and `dds data put` gained a flag telling it that the data has already been prepared offline, the command could proceed directly to passing the data to the storage endpoint.
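To make the idea more concrete, here is a minimal sketch of what such an offline preparation step could look like. Everything in it is an assumption for illustration: the tar.gz-plus-manifest format, the `--prepared` flag, and the paths are hypothetical, and it does not use the actual internal packaging (compression/encryption) that dds_cli performs.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of an offline "prepare" step.

This is NOT dds_cli code; it only illustrates the proposed workflow:
package the data once, locally on the HPC side, so a later upload step
does not have to re-read every file over the sshfs mount.
"""

import hashlib
import json
import tarfile
from pathlib import Path


def prepare_archive(source_dir: Path, staging_dir: Path) -> Path:
    """Package source_dir into staging_dir and write a manifest, so that a
    later `dds data put --prepared <staging_dir>` (proposed flag, does not
    exist yet) could verify and stream the archive without touching the
    original files again."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    archive_path = staging_dir / f"{source_dir.name}.tar.gz"

    # Read the directory once and stream it into a compressed archive.
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(source_dir, arcname=source_dir.name)

    # Record a checksum so the upload step can verify integrity of the
    # prepared archive instead of re-reading the source data.
    sha256 = hashlib.sha256()
    with open(archive_path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1024 * 1024), b""):
            sha256.update(chunk)

    manifest = {
        "source": str(source_dir),
        "archive": archive_path.name,
        "sha256": sha256.hexdigest(),
    }
    (staging_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return archive_path


if __name__ == "__main__":
    # Example paths only; in practice these would point at the HPC data
    # and a staging area with enough free space.
    prepare_archive(Path("/proj/data/run_2024"), Path("/proj/staging"))
```

With something like this, the upload side would only need to stream the already-prepared archive to the storage endpoint and check the recorded checksum, rather than reading and writing the data over the network several times.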
I am willing to try to put together a PR for this if I can be pointed in the right direction and get some feedback on how this functionality aligns with the goals of the team!