Huey, a lightweight task queue for python

Note

Just a heads-up, this post contains code examples for an out-of-date version of huey. If you're interested in seeing current example code, check out the documentation or this blog post.

Preface

At my job we've been doing a quarterly hackday for almost a year now. My coworkers have made some amazing stuff, and it's nice to have an entire day dedicated to hacking on ... well, whatever you want. Tomorrow marks the 4th hackday and I need to scrounge up a good project, but in the meantime I thought I'd write a post about what I did last time around -- a lightweight python task queue that has an API similar to celery.

I've called it huey (which also turns out to be the name of my kitten).

Design goals

The goal of the project was to keep it simple while not skimping on features (the whole thing is around 2K lines of code).

Backend storages implement a simple API; currently the only implementation uses Redis, but adding one backed by a database would be a snap.
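
To give a rough idea of what writing another backend would involve, here is a sketch of a database-backed queue. To be clear, this is not code from huey itself: the write()/read() method names are my assumption of what the storage interface looks like (mirroring the Redis backend), so treat it as pseudocode for the shape of the API rather than a working backend.

# hypothetical sketch of a database-backed queue -- not part of huey;
# the write()/read() method names are assumed, based on the Redis backend
class DatabaseQueue(object):
    def __init__(self, name, **connection):
        self.name = name
        self.connection = connection

    def write(self, data):
        # INSERT the serialized message into a table keyed by queue name
        raise NotImplementedError

    def read(self):
        # SELECT and remove the oldest message, returning None when empty
        raise NotImplementedError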

The other main goal of the project was to have it work easily for any python application (I've been into using flask lately), but come with baked-in support for django. Because of django's centralized configuration and conventions for loading modules, the django API is simpler than the python one, but hopefully both are reasonably straightforward.

The API

Like celery, huey uses decorators to mark tasks for execution out-of-band. Taking a look at the getting started guide, here is a trivial task that "counts some beans":

# commands.py
from huey.decorators import queue_command

from config import invoker # import the invoker we instantiated in config.py

@queue_command(invoker)
def count_beans(number):
    return 'Counted %s beans' % number
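
Calling the decorated function does not run it in-process; it enqueues a message for the consumer and returns right away. In the version of huey this post targets, the return value was a result handle with a get() method for fetching the task's return value. The session below is written from memory (and assumes a result store is configured, which the minimal config module below omits), so the exact output may differ:

>>> from commands import count_beans
>>> res = count_beans(10)   # enqueues the task; the consumer picks it up
>>> res.get()               # returns None until the task has actually run
'Counted 10 beans'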

You might be wondering what the hell an invoker is. Well, that is basically the name I gave the object that routes function calls to the task queue. It is instantiated in a simple python config module:

# config.py
from huey.backends.redis_backend import RedisBlockingQueue
from huey.bin.config import BaseConfiguration
from huey.queue import Invoker


queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
invoker = Invoker(queue)


class Configuration(BaseConfiguration):
    QUEUE = queue
    THREADS = 4

If you want to use this in a django project, the API is a bit simpler. Configuration occurs using the normal django settings file and the invoker is abstracted away. The biggest difference is that you import from "huey.djhuey" instead of "huey":

from huey.djhuey.decorators import queue_command

@queue_command
def count_beans(number):
    return 'Counted %s beans' % number

The requisite settings follow a similar pattern to those used by the generic python API:

INSTALLED_APPS = [
    'huey.djhuey',
    # ... your apps here
]

HUEY_CONFIG = {
    'QUEUE': 'huey.backends.redis_backend.RedisBlockingQueue',
    'QUEUE_NAME': 'test-queue',
    'QUEUE_CONNECTION': {
        'host': 'localhost',
        'port': 6379,
    },
    'THREADS': 4,
}

Running the consumer

Here is a screenshot of a couple terminals showing the consumer. In the left-hand terminal, I have an interactive shell in which I'm calling the "count some beans" function. In the top-right I am running the consumer, which simply runs in the foreground, listens for new jobs, and executes them (optionally storing their results for retrieval). The bottom right is monitoring the local redis database, showing the calls to "BRPOP" (get a message), "LPUSH" (add a message), and "HXXX" (store/retrieve the results of a task):

[Screenshot: the huey consumer]

If you're using django, you can simply call the management command django-admin.py run_huey.
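
For the generic python API there is no management command; the consumer is a standalone script that you point at the Configuration class from config.py. The invocation below is how I remember the old-style API working, so check the docs for the exact form:

$ huey_consumer.py config.Configuration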

Interested? There are docs!

If you'd like to read more, I've written up some documentation, which has step-by-step instructions for getting started and provides a comprehensive overview of the API. I've been running the consumer for several months now on a couple of sites and have been very pleased with it! I hope you find it useful, too.
