This repository has been archived by the owner. It is now read-only.

Way to run tasks #1

Closed
hirokiky opened this issue Jul 1, 2015 · 8 comments

Comments

@hirokiky

hirokiky commented Jul 1, 2015

Hi, I don't know how to run tasks by using this Dockerfile.

  • Where should I place tasks.py file to register tasks to run?
  • How can I notice new tasks for this container?

If you know, please tell me. 😃

@kjr

kjr commented Aug 14, 2015

You can mount your tasks.py into the container in the user's home directory (/home/user) and, I think, use the CELERY_APP environment variable (this worked for me, but I was browsing the issues here hoping to find a list of available environment variables...):

docker run --volume /path/to/tasks.py:/home/user/tasks.py --env CELERY_APP=tasks celery

Not sure how you would notice new tasks... any change to the file on the host would be seen by the container, but I don't think it will reload. Still, it's quick to restart the container...

@AdrianRibao

This needs to be documented.

How can this be used with Django, for example?

@tianon
Member

tianon commented Jan 22, 2016

+1 for some documentation 😇 (what we've got is impossibly sparse right now: https://github.com/docker-library/docs/blob/master/celery/content.md)

@esseti

esseti commented Jun 30, 2016

That's tricky. I tried with docker-compose but I can't find a way. Has anyone been able to run Celery in Docker and use the integration with Django?

@bittner

bittner commented Mar 19, 2017

Using this image seems ridiculous. If you have an application container, as you usually have with Django, you need all dependencies (things you import in tasks.py) installed in this container again.

That's why other projects (e.g. cookiecutter-django) reuse the application container for Celery, and only run a different command (command: celery ... worker) against it with docker-compose.
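As a sketch of that pattern (service names, project name myproject, and the Redis broker URL are all made up for illustration, not taken from cookiecutter-django), a docker-compose file reusing the application image for the worker might look like:

```yaml
# docker-compose.yml -- illustrative sketch of the reuse-the-app-image pattern:
# the same build serves web requests and runs the Celery worker, so tasks.py
# and everything it imports are installed only once.
version: '2'
services:
  redis:
    image: redis

  web:
    build: .                      # your Django application image
    command: python manage.py runserver 0.0.0.0:8000

  worker:
    build: .                      # the *same* image, different command
    command: celery -A myproject worker --loglevel=info
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
```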

@tianon
Member

tianon commented Mar 20, 2017 via email

@tianon
Member

tianon commented Apr 5, 2017

I've opened #12 to discuss this image's deprecation. In the meantime, I'm going to close this issue since I think the original question is as solved as it's going to get. 👍

@ArnaudPel

Agreed, it's not much more difficult to replace this image with a build on a standard Python image, with celery added to pip's requirements.txt, for example.
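A sketch of that replacement (the Python tag, file layout, and app name tasks are illustrative assumptions):

```dockerfile
# Dockerfile -- illustrative replacement for the celery image: build on a
# standard Python image and install celery via requirements.txt.
FROM python:3

WORKDIR /app
COPY requirements.txt .
# requirements.txt is assumed to list celery plus whatever tasks.py imports
RUN pip install --no-cache-dir -r requirements.txt

COPY tasks.py .
CMD ["celery", "-A", "tasks", "worker", "--loglevel=info"]
```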

Actually, doing so in the first place would have saved me two hours yesterday: this celery Docker image ignores the broker URL when it is provided from inside Python, as in app = Celery('tasks', broker='my url'), and only allows it to be set via the CELERY_BROKER_URL environment variable. OK, this is kind of hinted at in the docs, my bad for not reading carefully. But it's clearly counter-intuitive and time-consuming to discover when you have your nose in the code and aren't expecting some parameters to be forced but not others (the backend argument is not forced like this, for example, although it has the same kind of properties that could refer to other Docker containers: host, port, ...).

Anyway, I'm rambling again; I came here to comment about this parameter forcing, but since the whole thing is deprecated I reckon there's no need.


7 participants