How to Setup Django with Celery and Redis to run Asynchronous Tasks

Learn the process of setting up and configuring Celery and Redis for handling long-running processes in a Django app.

Picture of Nsikak Imoh, author of Macsika Blog

This post is part of the tutorial series titled Learn to Use Django with FastAPI Frameworks


When you work with Django or any framework, you might come across a functionality that requires you to develop a long-running process.

If a long-running process is part of your application's workflow, it is important to not block the response of your application.

Rather than blocking the response, you should handle it in the background, outside the normal request/response flow.

In this post, you will learn the process of setting up and configuring Celery and Redis for handling long-running processes in a Django app.

This will help you gain a general understanding of why celery message queues are valuable, along with how to utilize celery in conjunction with Redis in a Django application.

Before we begin, let's analyze a sample problem that would require a developer to use this workflow:

Why do you Need Celery to Handle Tasks Asynchronously?

Imagine we have a web application that requires users to submit a profile picture on registration, or an application that generates thumbnails of images submitted by users.

If our application processes this image and sends a confirmation email directly in the request handler, the end-user of our application would have to wait for the process to complete before the page loads or updates.

Doing this will end up creating a bad user experience for our end-users due to the long wait time.

What we can do to mitigate this is to pass these processes off to a task queue and let a separate worker process handle them.

This will allow us to immediately send a response back to the client.

The end-user can then do other things on the client-side of our application while the processing takes place.

Additionally, our application is also free to respond to requests from other users and clients.
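The pattern described above can be sketched with nothing but the Python standard library. This is not Celery itself, just a toy illustration of the task-queue idea, and names like handle_request and process_image are invented for the example:

```python
import queue
import threading

# The "broker": a queue the request handler pushes work onto.
task_queue = queue.Queue()
results = []

def worker():
    # The worker pulls tasks off the queue and runs them one by one,
    # completely outside the request/response flow.
    while True:
        task = task_queue.get()
        if task is None:  # sentinel to stop the worker
            break
        func, args = task
        results.append(func(*args))
        task_queue.task_done()

def process_image(name):
    # Stand-in for a slow job such as thumbnail generation.
    return f"processed {name}"

threading.Thread(target=worker, daemon=True).start()

def handle_request(image_name):
    # The request handler only enqueues the job -- it does not wait.
    task_queue.put((process_image, (image_name,)))
    return "202 Accepted"  # respond to the client immediately

response = handle_request("avatar.png")
task_queue.join()  # only so we can inspect the result here
```

The key point is that handle_request returns immediately; the slow work happens in the worker thread. Celery generalizes this idea across separate processes and machines, with Redis standing in for the in-process queue.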

What is Celery?

Celery is an asynchronous task queue with a focus on real-time processing, while also supporting task scheduling.

Celery is a Python software package.

It is driven by the information contained in messages that are produced in the application code.

What is the use of Celery?

We use celery to asynchronously handle long-running processes outside the normal HTTP request/response flow, in a background process.

Here are examples of tasks that celery is used for:

  1. Running machine learning models
  2. Sending confirmation emails
  3. Scraping and crawling
  4. Analyzing data
  5. Processing images
  6. Generating reports

What is the Redis Message Broker?

Redis is a performant, in-memory, key-value data store.

Specifically, Redis is used to store messages produced by the application code describing the work to be done in the Celery task queue.

Redis also serves as storage of results coming off the celery queues, which are then retrieved by consumers of the queue.

Celery is best used in conjunction with a storage solution that is often referred to as a message broker.

A common message broker that is used with celery is Redis. There is another message broker called RabbitMQ, but our focus in this tutorial is on Redis.
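To make the producer/consumer relationship concrete, here is a toy sketch of the broker idea using a plain Python deque in place of Redis. The publish and consume helpers are invented for illustration; real Celery messages carry far more metadata:

```python
import json
from collections import deque

# Toy stand-in for a message broker: the application serializes a
# task description to JSON and pushes it onto a queue; a worker on
# the other side pops and deserializes it. Redis plays this role
# for Celery, using its own data structures instead of a deque.
broker = deque()

def publish(task_name, *args):
    message = json.dumps({"task": task_name, "args": args})
    broker.append(message)

def consume():
    message = json.loads(broker.popleft())
    return message["task"], message["args"]

# The application code describes the work to be done...
publish("image_processor.tasks.adding_task", 2, 5)
# ...and a worker elsewhere picks it up.
task, args = consume()
```

Because the message is plain JSON, the producer and consumer can be entirely separate processes; this is also why the Celery settings later in this tutorial declare application/json as the serialization format.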

How to Set up Django with Celery, and Redis

To run this workflow, we will need to install and set up Redis, Celery, and Django in our development environment.

Each of these tools will run on a separate server.

1. Install Redis

The first step in our procedure is to install Redis on our machine.

How to Install Redis on Windows

  1. Download the Redis zip file and unzip it in some directory.
  2. Launch the file named redis-server.exe by double-clicking, which opens it in a command prompt window.
  3. Next, locate a different file named redis-cli.exe and double-click it to open the program in a separate command prompt window.
  4. Run a test to ensure the client can talk to the server by issuing the command ping within the command window running the CLI client. If it is correctly installed, you should get a response PONG.

How to Install Redis on macOS and Linux

  1. Download the Redis tarball file and extract it in some directory. On macOS, you can instead install it with Homebrew.
  2. Build and install the program with make install.
  3. Run the redis-server command in a new terminal window.
  4. Run redis-cli in another terminal window.
  5. Test to ensure the client can talk to the server by issuing the command ping within the terminal window running the CLI client. If it is correctly installed, you should get a response PONG.

2. Install Python Virtual Env and Dependencies

We will create a Python3 virtual environment to install the dependency packages necessary for this project.

  • Django
  • Celery
  • Redis
  • Pillow is a Python package for image processing. We will use it later in this tutorial to demo a real use case for celery tasks.
  • Django Widget Tweaks is a Django plugin for providing flexibility in how form inputs are rendered.

To begin, create a directory to hold the project called image_processor, then inside it, create a virtual environment.

mkdir image_processor
cd image_processor
mkdir src
python3 -m venv venv 
source venv/bin/activate

Next, run the first command below to install the Python packages, then the second command to store our requirements in a requirements.txt file.

pip install Django Celery redis Pillow django-widget-tweaks
pip freeze > requirements.txt

3. Set Up the Django Project

Create a Django project named core, then a Django app named image_processor.

(venv) $ cd src
(venv) $ django-admin startproject core . # take note of the dot at the end.
(venv) $ python manage.py startapp image_processor

Add the image_processor and widget_tweaks app to the list of INSTALLED_APPS in the core project's settings.py module.

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'image_processor',
    'widget_tweaks',
]

At this point, if we run the command below,

tree -I venv

The directory structure will look as follows:

└── src
    ├── core
    │   ├── __init__.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── manage.py
    └── image_processor
        ├── __init__.py
        ├── admin.py
        ├── apps.py
        ├── migrations
        │   └── __init__.py
        ├── models.py
        ├── tests.py
        └── views.py

4. Configure Celery for the Django Project

To set up Celery within this Django project, we will follow conventions described in the Celery docs.

Create a new module in src/core/celery.py.

Within this new Python module, import the os package. This will be used to associate an environment variable called DJANGO_SETTINGS_MODULE with the Django project's settings module.

Next, import the Celery class from the celery package and instantiate it to create the celery_app instance.

The code block below should be in src/core/celery.py

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')

celery_app = Celery('core')
celery_app.config_from_object('django.conf:settings', namespace='CELERY')
celery_app.autodiscover_tasks()


Next, configure the Celery application by adding settings prefixed with 'CELERY_' at the very bottom of the Django project's settings module.

These settings tell Celery to use Redis as the message broker and where to connect to it.

They also tell Celery that messages passed between the Celery task queues and the Redis message broker are serialized with the MIME type application/json.

The code block below should be in src/core/settings.py

# celery settings
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'

Finally, ensure the newly created celery_app instance is loaded whenever the Django project starts.

This will ensure that the previously created and configured Celery application gets injected into the Django application when it runs.

To do this, import the Celery application within the Django project's main __init__.py module and register it explicitly as a namespace symbol within the "core" Django package.

The code block below should be in src/core/__init__.py

from .celery import celery_app

__all__ = ('celery_app',)

5. Set Up the Celery Tasks Module

Create a new module named tasks.py within the “image_processor” application.

Inside the tasks.py module, import the shared_task decorator and use it to define a celery task function called adding_task, as shown below.

from celery import shared_task

@shared_task
def adding_task(x, y):
    return x + y

6. Test The Set Up

To test our setup, we will need to run three different server environments for Redis, Celery, and Django, respectively.

  • In one terminal, run the redis-server like so:

    $ redis-server
    
  • In a second terminal, with the Python virtual environment activated, launch the celery worker from the project's root package directory where the manage.py is located.

    celery -A core worker --loglevel=info
    
  • In the third and final terminal, launch the Django Python shell and test out the adding_task, like so:

    (venv) $ python manage.py shell
    >>> from image_processor.tasks import adding_task
    >>> task = adding_task.delay(2, 5)
    >>> print(f"id={task.id}, state={task.state}, status={task.status}")
    >>> task.get()
    

Note the use of the .delay(...) method on the adding_task object.

This is the common way in celery to pass any necessary parameters to the task, as well as to initiate sending it off to the message broker and task queue.

The result of calling the .delay(...) method is a promise-like return value of the type celery.result.AsyncResult.

This return value holds information such as the id of the task, its execution state, and its status, along with access to any results produced by the task via the .get() method, as shown in the example above.
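If you have used concurrent.futures, this promise-like behavior will feel familiar. The sketch below is only a standard-library analogy, not Celery itself; it shows the same submit-now, collect-later shape:

```python
from concurrent.futures import ThreadPoolExecutor

# AsyncResult behaves much like a Future from the standard library:
# submitting work returns a handle immediately, and .result() (like
# Celery's .get()) blocks until the value is ready.
def adding_task(x, y):
    return x + y

with ThreadPoolExecutor() as pool:
    future = pool.submit(adding_task, 2, 5)  # analogous to adding_task.delay(2, 5)
    value = future.result()                  # analogous to task.get()
```

The difference is that Celery hands the work to a separate worker process via the broker, so the handle also has to track state across process boundaries, which is why AsyncResult exposes fields like .id and .state.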

Wrap Off

The aim of this post is to help you gain a general understanding of why celery message queues are valuable, along with how to utilize celery in conjunction with Redis in a Django application.

You also learned how to set up celery with a Redis broker in a Django application.

If you learned from this tutorial, or it helped you in any way, please consider sharing and subscribing to our newsletter.

Get the Complete Code of Django and FastAPI Combo Tutorials on Github.

Connect with me.

Need an engineer on your team to grease an idea, build a great product, grow a business or just sip tea and share a laugh?