The aim of this project was to create a Django/Celery project using Python only from within a Docker container.
My considerations about how to set up Docker are documented here:
- Dockerfile
- docker-compose.project.yml
- docker-compose.yml
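For orientation, here is a minimal sketch of the shape docker-compose.yml might take, assuming Redis serves as both broker and result backend (images, ports, and service names are illustrative, not copied from the project files):

```yaml
# Illustrative sketch only; the real files live in this repository.
services:
  redis:
    image: redis:7

  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    environment:
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - redis

  worker:
    build: .
    command: celery -A djocker worker --loglevel=info
    environment:
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - redis
```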
The Celery app itself is created in `djocker/celery.py`:

```python
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djocker.settings')

app = Celery('Djocker task processor')
app.config_from_object('django.conf:settings', namespace='CELERY')

# Look up task modules in all Django applications and load them.
app.autodiscover_tasks()
```

The setting names matter: the `CELERY` prefix corresponds to the `namespace` argument. `CELERY_BROKER` and `CELERY_BACKEND` are URLs pointing to the Redis or RabbitMQ services; see also the docker-compose.yml file in this project.
The corresponding settings read those URLs from the environment:

```python
import os

CELERY_BROKER_URL = os.getenv("CELERY_BROKER")
CELERY_RESULT_BACKEND = os.getenv("CELERY_BACKEND")
```

In order for Celery to work with Django, the app needs to be:
- placed in this position in the project tree, relative to the settings module you defined in DJANGO_SETTINGS_MODULE;
- defined with this exact name (see the snippet below).
I found this part confusing: the `celery` command at deployment refers to the file where the app is created, yet you still need to import the app as shown below. As the Celery docs explain, this import ensures the app is loaded when Django starts, so that the `@shared_task` decorator will use it.
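For context, the deployment command in question looks something like this (a sketch; the exact invocation depends on your compose setup):

```sh
# Start a worker; -A points Celery at the djocker package.
celery -A djocker worker --loglevel=info
```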
The import, in `djocker/__init__.py`:

```python
from .celery import app as celery_app

__all__ = ('celery_app',)
```

In this way you can use Celery's full potential:
```python
from djocker.celery import app

@app.task
def run_task(argument):
    ...
```

The next task is scheduled fine, but you have no access to its result via the result backend if it is decorated in this fashion:
```python
from celery import shared_task

@shared_task
def unimportant_result():
    ...
```
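To make the difference concrete, here is a hedged sketch of a call site; the module path `djocker.tasks` is an assumption about where the tasks above live, not taken from the project:

```python
# Hypothetical call site, e.g. in a Django view or a shell session.
# The module path djocker.tasks is an assumption, not from the project.
from djocker.tasks import run_task

# Send the task to the worker; returns an AsyncResult handle.
result = run_task.delay(42)

# Block until the worker finishes and fetch the return value from
# the result backend (Redis or RabbitMQ, per CELERY_RESULT_BACKEND).
print(result.get(timeout=10))
```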