Hello. We have a Django app for which we have "django envs", which can be thought of as the equivalent of Rails envs. This allows us to run the app locally against various databases (among other things). We would like to run dramatiq on at least two of these (local) envs, using a local Redis instance as both broker and result backend, and we would like to avoid having these two envs collide.

One idea that came to mind was to do something like the sketch below (we have a separate settings file for each env, essentially mimicking what happens in Rails) and use a different Redis db for each env. However, we are already using the various Redis dbs for other apps (typically Sidekiq), so this seems like a suboptimal solution.
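Roughly, a minimal sketch of that idea (assuming dramatiq's plain `RedisBroker`/`RedisBackend` API; our actual per-env settings wiring may differ):

```python
# settings/dev.py -- hypothetical per-env settings module
import dramatiq
from dramatiq.brokers.redis import RedisBroker
from dramatiq.results import Results
from dramatiq.results.backends import RedisBackend

# Each env's settings file would point at a different Redis db,
# e.g. db 1 for "dev", db 2 for "staging", etc.
REDIS_URL = "redis://127.0.0.1:6379/1"

broker = RedisBroker(url=REDIS_URL)
broker.add_middleware(Results(backend=RedisBackend(url=REDIS_URL)))
dramatiq.set_broker(broker)
```

This only isolates the two envs by db number, which is exactly what feels suboptimal given that the other dbs are already taken by other apps.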
Is there a natural (or at least recommended) way to isolate the jobs coming from separate envs (and maybe the results as well)?
Thanks for reading