# Django-RQ
[](https://github.com/rq/django-rq/actions/workflows/test.yml)
Django integration with [RQ](https://github.com/nvie/rq), a [Redis](http://redis.io/)-based Python queuing library. [Django-RQ](https://github.com/rq/django-rq) is a simple app that allows you to configure your queues in Django's `settings.py` and easily use them in your project.
## Support Django-RQ
If you find `django-rq` useful, please consider supporting its development via [Tidelift](https://tidelift.com/subscription/pkg/pypi-django_rq?utm_source=pypi-django-rq&utm_medium=referral&utm_campaign=readme).
## Requirements
- [Django](https://www.djangoproject.com/) (3.2+)
- [RQ](https://github.com/nvie/rq)
## Installation
- Install `django-rq` (or [download from PyPI](http://pypi.python.org/pypi/django-rq)):
```bash
pip install django-rq
```
- Add `django_rq` to `INSTALLED_APPS` in `settings.py`:
```python
INSTALLED_APPS = (
    # other apps
    "django_rq",
)
```
- Configure your queues in Django's `settings.py`:
```python
RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
        'USERNAME': 'some-user',
        'PASSWORD': 'some-password',
        'DEFAULT_TIMEOUT': 360,
        'DEFAULT_RESULT_TTL': 800,
        'REDIS_CLIENT_KWARGS': {  # Eventual additional Redis connection arguments
            'ssl_cert_reqs': None,
        },
    },
    'with-sentinel': {
        'SENTINELS': [('localhost', 26736), ('localhost', 26737)],
        'MASTER_NAME': 'redismaster',
        'DB': 0,
        # Redis username/password
        'USERNAME': 'redis-user',
        'PASSWORD': 'secret',
        'SOCKET_TIMEOUT': 0.3,
        'CONNECTION_KWARGS': {  # Eventual additional Redis connection arguments
            'ssl': True,
        },
        'SENTINEL_KWARGS': {  # Eventual Sentinel connection arguments
            # If Sentinel also has auth, username/password can be passed here
            'username': 'sentinel-user',
            'password': 'secret',
        },
    },
    'high': {
        'URL': os.getenv('REDISTOGO_URL', 'redis://localhost:6379/0'),  # If you're on Heroku
        'DEFAULT_TIMEOUT': 500,
    },
    'low': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
    },
}
RQ_EXCEPTION_HANDLERS = ['path.to.my.handler'] # If you need custom exception handlers
```
- Include `django_rq.urls` in your `urls.py`:
```python
from django.urls import include, path

urlpatterns += [
    path('django-rq/', include('django_rq.urls')),
]
```
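A custom exception handler referenced by `RQ_EXCEPTION_HANDLERS` is a plain callable with RQ's handler signature. A minimal sketch (the module path and handler name here are placeholders, matching the `path.to.my.handler` example above):

```python
# handlers.py -- a hypothetical module referenced by RQ_EXCEPTION_HANDLERS.
def my_handler(job, exc_type, exc_value, traceback):
    # Inspect or report the failure here (e.g. log it, notify an error
    # tracker), then return True to let the next handler in the chain
    # (including RQ's default failure handling) run as well.
    print(f'Job {job.id} failed: {exc_type.__name__}: {exc_value}')
    return True
```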
## Usage
### Putting jobs in the queue
Django-RQ allows you to easily put jobs into any of the queues defined in `settings.py`. It comes with a few utility functions:
- `enqueue` - push a job to the `default` queue:
```python
import django_rq
django_rq.enqueue(func, foo, bar=baz)
```
- `get_queue` - returns a `Queue` instance.
```python
import django_rq
queue = django_rq.get_queue('high')
queue.enqueue(func, foo, bar=baz)
```
In addition to the `name` argument, `get_queue` also accepts `default_timeout`, `is_async`, `autocommit`, `connection` and `queue_class` arguments. For example:
```python
queue = django_rq.get_queue('default', autocommit=True, is_async=True, default_timeout=360)
queue.enqueue(func, foo, bar=baz)
```
You can provide your own singleton Redis connection object to this function so that it will not create a new connection object for each queue definition. This helps limit the number of connections to the Redis server. For example:
```python
import django_rq
import redis
redis_cursor = redis.StrictRedis(host='localhost', port=6379, db=0, password='some-password')
high_queue = django_rq.get_queue('high', connection=redis_cursor)
low_queue = django_rq.get_queue('low', connection=redis_cursor)
```
- `get_connection` - accepts a single queue name argument (defaults to "default") and returns a connection to the queue's Redis server:
```python
import django_rq
redis_conn = django_rq.get_connection('high')
```
- `get_worker` - accepts optional queue names and returns a new RQ `Worker` instance for the specified queues (or the `default` queue):
```python
import django_rq
worker = django_rq.get_worker() # Returns a worker for "default" queue
worker.work()
worker = django_rq.get_worker('low', 'high') # Returns a worker for "low" and "high"
```
### `@job` decorator
To easily turn a callable into an RQ task, you can also use the `@job` decorator that comes with `django_rq`:
```python
from django_rq import job

@job
def long_running_func():
    pass

long_running_func.delay()  # Enqueue function in "default" queue

@job('high')
def long_running_func():
    pass

long_running_func.delay()  # Enqueue function in "high" queue
```
You can pass in any arguments that RQ's job decorator accepts:
```python
@job('default', timeout=3600)
def long_running_func():
    pass

long_running_func.delay()  # Enqueue function with a timeout of 3600 seconds.
```
It's possible to specify a default for the `result_ttl` decorator keyword argument via the `DEFAULT_RESULT_TTL` setting:
```python
RQ = {
    'DEFAULT_RESULT_TTL': 5000,
}
```
With this setting, the `job` decorator will set `result_ttl` to 5000 unless it is specified explicitly or included in the queue config.
### Running workers
django_rq provides a management command that starts a worker for every queue specified as arguments:
```bash
python manage.py rqworker high default low
```
If you want to run `rqworker` in burst mode, you can pass in the `--burst` flag:
```bash
python manage.py rqworker high default low --burst
```
If you need to use custom worker, job or queue classes, it is best to use global settings (see [Custom queue classes](#custom-queue-classes) and [Custom job and worker classes](#custom-job-and-worker-classes)). However, it is also possible to override such settings with command line options as follows.
To use a custom worker class, you can pass in the `--worker-class` flag with the path to your worker:
```bash
python manage.py rqworker high default low --worker-class 'path.to.GeventWorker'
```
To use a custom queue class, you can pass in the `--queue-class` flag with the path to your queue class:
```bash
python manage.py rqworker high default low --queue-class 'path.to.CustomQueue'
```
To use a custom job class, provide the `--job-class` flag.
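Analogous to the flags above, an invocation might look like this (the class path is a placeholder):

```shell
python manage.py rqworker high default low --job-class 'path.to.CustomJob'
```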
Starting from version 2.10, running RQ's worker-pool is also supported:
```bash
python manage.py rqworker-pool default low medium --num-workers 4
```
### Support for Scheduled Jobs
With RQ 1.2.0 you can use the [built-in scheduler](https://python-rq.org/docs/scheduling/) for your jobs. For example:
```python
from datetime import datetime
from django_rq.queues import get_queue
queue = get_queue('default')
job = queue.enqueue_at(datetime(2020, 10, 10), func)
```
If you are using the built-in scheduler, you have to start workers with scheduler support:
```bash
python manage.py rqworker --with-scheduler
```
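`enqueue_at` takes an absolute datetime; a common pattern is to compute it relative to now. A sketch (only the time arithmetic below is runnable as-is; the commented queue calls assume the `default` queue from `RQ_QUEUES` and a `func` callable):

```python
from datetime import datetime, timedelta, timezone

# Timezone-aware datetimes avoid ambiguity between the clock of the
# enqueuing process and the clock of the worker.
run_at = datetime.now(timezone.utc) + timedelta(minutes=10)

# With a django_rq queue (requires a configured 'default' queue and Redis):
# queue = django_rq.get_queue('default')
# job = queue.enqueue_at(run_at, func)
```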
### Support for RQ's CronScheduler
Create a cron configuration file:
```python
# cron_config.py
from rq import cron
from myapp.tasks import send_report, sync_data
cron.register(send_report, queue_name='default', cron='0 9 * * *') # Daily at 9:00 AM
cron.register(sync_data, queue_name='high', interval=30) # Every 30 seconds
```
Then start the cron scheduler:
```bash
python manage.py rqcron cron_config.py
```
For more options, visit [RQ's CronScheduler documentation](https://python-rq.org/docs/cron/).
### Support for django-redis and django-redis-cache
If you have [django-redis](https://django-redis.readthedocs.org/) or [django-redis-cache](https://github.com/sebleier/django-redis-cache/) installed, you can instruct django_rq to use the same connection information from your Redis cache. This has two advantages: it's DRY and it takes advantage of any optimization that may be going on in your cache setup (like using connection pooling or [Hiredis](https://github.com/redis/hiredis)).
To configure it, use a dict with the key `USE_REDIS_CACHE` pointing to the name of the desired cache in your `RQ_QUEUES` dict. The chosen cache must exist and use the Redis backend; see your respective Redis cache package docs for configuration instructions. Note that because the django-redis-cache `ShardedClient` splits the cache over multiple Redis connections, it does not work with django-rq.
Here is an example settings fragment for `django-redis`:
```python
CACHES = {
    'redis-cache': {
        'BACKEND': 'redis_cache.cache.RedisCache',
        'LOCATION': 'localhost:6379:1',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
            'MAX_ENTRIES': 5000,
        },
    },
}

RQ_QUEUES = {
    'high': {
        'USE_REDIS_CACHE': 'redis-cache',
    },
    'low': {
        'USE_REDIS_CACHE': 'redis-cache',
    },
}
```
### Suspending and Resuming Workers
Sometimes you may want to suspend RQ to prevent it from processing new jobs. A classic example is during the initial phase of a deployment script or in advance of putting your site into maintenance mode. This is particularly helpful when you have jobs that are relatively long-running and might otherwise be forcibly killed during the deploy.
The `suspend` command stops workers on _all_ queues (in a single Redis database) from picking up new jobs. However, currently running jobs will continue until completion.
```bash
# Suspend indefinitely
python manage.py rqsuspend
# Suspend for a specific duration (in seconds) then automatically
# resume work again.
python manage.py rqsuspend -d 600
# Resume work again.
python manage.py rqresume
```
### Queue Statistics
`django_rq` also provides a dashboard to monitor the status of your queues at `/django-rq/` (or whatever URL you set in your `urls.py` during installation).
You can also add a link to this dashboard in `/admin` by adding `RQ_SHOW_ADMIN_LINK = True` in `settings.py`. Be careful though, this will override the default admin template so it may interfere with other apps that modify the default admin template.
These statistics are also available in JSON format via `/django-rq/stats.json`, which is accessible to staff members. If you need to access this view via other HTTP clients (for monitoring purposes), you can define `RQ_API_TOKEN`. Then, include the token in the Authorization header as a Bearer token: `Authorization: Bearer <token>` and access it via `/django-rq/stats.json`.
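For example, a monitoring client could attach the token using Python's standard library (the URL and token values below are placeholders for your deployment's actual values):

```python
import json
import urllib.request

# Placeholders: substitute your site's URL and the value of RQ_API_TOKEN.
STATS_URL = "https://example.com/django-rq/stats.json"
API_TOKEN = "your-rq-api-token"

request = urllib.request.Request(
    STATS_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)

# Uncomment to query a live deployment:
# with urllib.request.urlopen(request) as response:
#     stats = json.load(response)
#     print(stats)
```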

Note: Statistics of scheduled jobs display jobs from [RQ built-in scheduler](https://python-rq.org/docs/scheduling/), not optional [RQ scheduler](https://github.com/rq/rq-scheduler).
Additionally, these statistics are also accessible from the command line.
```bash
python manage.py rqstats
python manage.py rqstats --interval=1 # Refreshes every second
python manage.py rqstats --json # Output as JSON
python manage.py rqstats --yaml # Output as YAML
```

### Configuring Prometheus
`django_rq` also provides a Prometheus compatible view, which can be enabled by installing `prometheus_client` or installing the extra "prometheus-metrics" (`pip install django-rq[prometheus]`). The metrics are exposed at `/django-rq/metrics/` and the following is an example of the metrics that are exported:
```text
# HELP rq_workers RQ workers
# TYPE rq_workers gauge
# HELP rq_job_successful_total RQ successful job count
# TYPE rq_job_successful_total counter
# HELP rq_job_failed_total RQ failed job count
# TYPE rq_job_failed_total counter
# HELP rq_working_seconds_total RQ total working time
# TYPE rq_working_seconds_total counter
# HELP rq_jobs RQ jobs by status
# TYPE rq_jobs gauge
rq_jobs{queue="default",status="queued"} 0.0
rq_jobs{queue="default",status="started"} 0.0
rq_jobs{queue="default",status="finished"} 0.0
rq_jobs{queue="default",status="failed"} 0.0
rq_jobs{queue="default",status="deferred"} 0.0
rq_jobs{queue="default",status="scheduled"} 0.0
```
If you need to access this view via other HTTP clients (for monitoring purposes), you can define `RQ_API_TOKEN`. Then, include the token in the Authorization header as a Bearer token: `Authorization: Bearer <token>` and access it via `/django-rq/metrics`.
### Configuring Logging
RQ uses Python's `logging`, which means you can easily configure `rqworker`'s logging mechanism in Django's `settings.py`. For example:
```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "rq_console": {
            "format": "%(asctime)s %(message)s",
            "datefmt": "%H:%M:%S",
        },
    },
    "handlers": {
        "rq_console": {
            "level": "DEBUG",
            "class": "rq.logutils.ColorizingStreamHandler",
            "formatter": "rq_console",
            "exclude": ["%(asctime)s"],
        },
    },
    "loggers": {
        "rq.worker": {
            "handlers": ["rq_console"],  # Add handlers (e.g. Sentry) here if configured
            "level": "DEBUG",
        },
    },
}
```
### Custom Queue Classes
By default, every queue will use the `DjangoRQ` class. If you want to use a custom queue class, you can do so by adding a `QUEUE_CLASS` option on a per-queue basis in `RQ_QUEUES`:
```python
RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
        'QUEUE_CLASS': 'module.path.CustomClass',
    },
}
```
Or you can specify a custom class to be used for all your queues in the `RQ` setting:
```python
RQ = {
    'QUEUE_CLASS': 'module.path.CustomClass',
}
```
Custom queue classes should inherit from `django_rq.queues.DjangoRQ`.
If you are using more than one queue class (not recommended), be sure to run workers only on queues that share the same queue class. For example, if you have two queues defined in `RQ_QUEUES` and one has a custom class specified, you would have to run at least two separate workers, one for each queue.
### Custom Job and Worker Classes
Similarly to custom queue classes, global custom job and worker classes can be configured using `JOB_CLASS` and `WORKER_CLASS` settings:
```python
RQ = {
    'JOB_CLASS': 'module.path.CustomJobClass',
    'WORKER_CLASS': 'module.path.CustomWorkerClass',
}
```
A custom job class should inherit from `rq.job.Job`; it will be used for all jobs if configured.
A custom worker class should inherit from `rq.worker.Worker`; it will be used for running all workers unless overridden by the `rqworker` management command's `--worker-class` option.
### Testing Tip
For an easier testing process, you can run a worker synchronously this way:
```python
from django.test import TestCase
from django_rq import get_worker
class MyTest(TestCase):
    def test_something_that_creates_jobs(self):
        ...  # Stuff that creates jobs.
        get_worker().work(burst=True)  # Processes all jobs, then stops.
        ...  # Assert that the jobs' work is done.
```
### Synchronous Mode
You can set the option `ASYNC` to `False` to make synchronous operation the default for a given queue. This will cause jobs to execute immediately and on the same thread as they are dispatched, which is useful for testing and debugging. For example, you might add the following after your queue configuration in your settings file:
```python
# ... Logic to set DEBUG and TESTING settings to True or False ...
# ... Regular RQ_QUEUES setup code ...
if DEBUG or TESTING:
    for queue_config in RQ_QUEUES.values():
        queue_config['ASYNC'] = False
```
Note that setting the `is_async` parameter explicitly when calling `get_queue` will override this setting.
## Running Tests
To run `django_rq`'s test suite (you'll need `pytest-django`):
```bash
pytest
```
## Deploying on Ubuntu
Create an rqworker service that runs the high, default, and low queues.
```bash
sudo vi /etc/systemd/system/rqworker.service
```
```bash
[Unit]
Description=Django-RQ Worker
After=network.target

[Service]
WorkingDirectory=<<path_to_your_project_folder>>
ExecStart=/home/ubuntu/.virtualenv/<<your_virtualenv>>/bin/python \
    <<path_to_your_project_folder>>/manage.py \
    rqworker high default low

[Install]
WantedBy=multi-user.target
```
Enable and start the service:
```bash
sudo systemctl enable rqworker
sudo systemctl start rqworker
```
## Deploying on Heroku
Add `django-rq` to your `requirements.txt` file with:
```bash
pip freeze > requirements.txt
```
Update your `Procfile` to:
```bash
web: gunicorn --pythonpath="$PWD/your_app_name" config.wsgi:application
worker: python your_app_name/manage.py rqworker high default low
```
Commit and re-deploy. Then add your new worker with:
```bash
heroku scale worker=1
```
## Changelog
See [CHANGELOG.md](https://github.com/rq/django-rq/blob/master/CHANGELOG.md).
---
Django-RQ is maintained by [Stamps](https://stamps.id), an Indonesian based company that provides enterprise grade CRM and order management systems.