File: application.rst

.. _guide-app:

=============
 Application
=============

.. contents::
    :local:
    :depth: 1

The Celery library must be instantiated before use; this instance
is called an application (or *app* for short).

The application is thread-safe so that multiple Celery applications
with different configurations, components, and tasks can co-exist in the
same process space.
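
For example, two apps with different configurations can live side by side
in the same process. A minimal sketch (the names and settings below are
only illustrative):

.. code-block:: python

    from celery import Celery

    # Two independent app instances, each with its own configuration.
    app_eu = Celery('eu')
    app_eu.conf.timezone = 'Europe/London'

    app_us = Celery('us')
    app_us.conf.timezone = 'America/New_York'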

Let's create one now:

.. code-block:: pycon

    >>> from celery import Celery
    >>> app = Celery()
    >>> app
    <Celery __main__:0x100469fd0>

The last line shows the textual representation of the application,
which includes the name of the app class (``Celery``), the name of the
current main module (``__main__``), and the memory address of the object
(``0x100469fd0``).

Main Name
=========

Only one of these is important, and that's the main module name.
Let's look at why that is.

When you send a task message in Celery, that message won't contain
any source code, but only the name of the task you want to execute.
This works similarly to how host names work on the internet: every worker
maintains a mapping of task names to their actual functions, called the *task
registry*.

Whenever you define a task, that task will also be added to the local registry:

.. code-block:: pycon

    >>> @app.task
    ... def add(x, y):
    ...     return x + y

    >>> add
    <@task: __main__.add>

    >>> add.name
    __main__.add

    >>> app.tasks['__main__.add']
    <@task: __main__.add>

and there you see that ``__main__`` again; whenever Celery isn't able
to detect what module the function belongs to, it uses the main module
name to generate the beginning of the task name.

This is only a problem in a limited set of use cases:

    #. If the module that the task is defined in is run as a program.
    #. If the application is created in the Python shell (REPL).

Take this example, where the tasks module is also used to start a worker
with :meth:`@worker_main`:

:file:`tasks.py`:

.. code-block:: python

    from celery import Celery
    app = Celery()

    @app.task
    def add(x, y): return x + y

    if __name__ == '__main__':
        args = ['worker', '--loglevel=INFO']
        app.worker_main(argv=args)

When this module is executed the tasks will be named starting with "``__main__``",
but when the module is imported by another process, say to call a task,
the tasks will be named starting with "``tasks``" (the real name of the module):

.. code-block:: pycon

    >>> from tasks import add
    >>> add.name
    tasks.add

You can specify another name for the main module:

.. code-block:: pycon

    >>> app = Celery('tasks')
    >>> app.main
    'tasks'

    >>> @app.task
    ... def add(x, y):
    ...     return x + y

    >>> add.name
    tasks.add

.. seealso:: :ref:`task-names`

Configuration
=============

There are several options you can set that'll change how
Celery works. These options can be set directly on the app instance,
or you can use a dedicated configuration module.

The configuration is available as :attr:`@conf`:

.. code-block:: pycon

    >>> app.conf.timezone
    'Europe/London'

where you can also set configuration values directly:

.. code-block:: pycon

    >>> app.conf.enable_utc = True

or update several keys at once by using the ``update`` method:

.. code-block:: pycon

    >>> app.conf.update(
    ...     enable_utc=True,
    ...     timezone='Europe/London',
    ...)

The configuration object consists of multiple dictionaries
that are consulted in order:

    #. Changes made at run-time.
    #. The configuration module (if any).
    #. The default configuration (:mod:`celery.app.defaults`).

You can even add new default sources by using the :meth:`@add_defaults`
method.
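
For example, a dictionary of fall-back values can be registered as an
extra default source. A minimal sketch (the setting shown is only
illustrative):

.. code-block:: python

    app.add_defaults({
        'task_default_rate_limit': '10/m',  # used only if not set elsewhere
    })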

.. seealso::

    Go to the :ref:`Configuration reference <configuration>` for a complete
    listing of all the available settings, and their default values.

``config_from_object``
----------------------

The :meth:`@config_from_object` method loads configuration
from a configuration object.

This can be a configuration module, or any object with configuration attributes.

Note that any configuration that was set previously will be reset when
:meth:`~@config_from_object` is called. If you want to set additional
configuration, you should do so after this call.
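
A minimal sketch (the extra setting shown is only illustrative):

.. code-block:: python

    app.config_from_object('celeryconfig')

    # Direct changes must come *after* the call, or they'd be reset:
    app.conf.task_default_queue = 'default'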

Example 1: Using the name of a module
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The :meth:`@config_from_object` method can take the fully qualified
name of a Python module, or even the name of a Python attribute,
for example: ``"celeryconfig"``, ``"myproj.config.celery"``, or
``"myproj.config:CeleryConfig"``:

.. code-block:: python

    from celery import Celery

    app = Celery()
    app.config_from_object('celeryconfig')

The ``celeryconfig`` module may then look like this:

:file:`celeryconfig.py`:

.. code-block:: python

    enable_utc = True
    timezone = 'Europe/London'

and the app will be able to use it as long as ``import celeryconfig`` is
possible.

Example 2: Passing an actual module object
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

You can also pass an already imported module object, but this
isn't always recommended.

.. tip::

    Using the name of a module is recommended as this means the module does
    not need to be serialized when the prefork pool is used. If you're
    experiencing configuration problems or pickle errors then please
    try using the name of a module instead.

.. code-block:: python

    import celeryconfig

    from celery import Celery

    app = Celery()
    app.config_from_object(celeryconfig)


Example 3: Using a configuration class/object
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

    from celery import Celery

    app = Celery()

    class Config:
        enable_utc = True
        timezone = 'Europe/London'

    app.config_from_object(Config)
    # or using the fully qualified name of the object:
    #   app.config_from_object('module:Config')

``config_from_envvar``
----------------------

The :meth:`@config_from_envvar` method takes the configuration module name
from an environment variable.

For example -- to load configuration from a module specified in the
environment variable named :envvar:`CELERY_CONFIG_MODULE`:

.. code-block:: python

    import os
    from celery import Celery

    #: Set default configuration module name
    os.environ.setdefault('CELERY_CONFIG_MODULE', 'celeryconfig')

    app = Celery()
    app.config_from_envvar('CELERY_CONFIG_MODULE')

You can then specify the configuration module to use via the environment:

.. code-block:: console

    $ CELERY_CONFIG_MODULE="celeryconfig.prod" celery worker -l INFO

.. _app-censored-config:

Censored configuration
----------------------

If you ever want to print out the configuration, as debugging information
or similar, you may also want to filter out sensitive information like
passwords and API keys.

Celery comes with several utilities useful for presenting the configuration;
one is :meth:`~celery.app.utils.Settings.humanize`:

.. code-block:: pycon

    >>> app.conf.humanize(with_defaults=False, censored=True)

This method returns the configuration as a tabulated string. This will
only contain changes to the configuration by default, but you can include the
built-in default keys and values by enabling the ``with_defaults`` argument.

If you instead want to work with the configuration as a dictionary, you
can use the :meth:`~celery.app.utils.Settings.table` method:

.. code-block:: pycon

    >>> app.conf.table(with_defaults=False, censored=True)

Please note that Celery won't be able to remove all sensitive information,
as it merely uses a regular expression to search for commonly named keys.
If you add custom settings containing sensitive information, name the
keys so that Celery will identify them as secret.

A configuration setting will be censored if the name contains any of
these sub-strings:

``API``, ``TOKEN``, ``KEY``, ``SECRET``, ``PASS``, ``SIGNATURE``, ``DATABASE``
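
As a sketch (the setting names below are only illustrative), a custom key
containing one of these sub-strings is masked in the censored output, while
other keys are shown as-is:

.. code-block:: pycon

    >>> app.conf.my_service_api_token = 'shhh'        # matches TOKEN: censored
    >>> app.conf.my_service_hostname = 'example.com'  # no match: shown in clear text
    >>> print(app.conf.humanize(with_defaults=False, censored=True))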

Laziness
========

The application instance is lazy, meaning it won't be evaluated
until it's actually needed.

Creating a :class:`@Celery` instance will only do the following:

    #. Create a logical clock instance, used for events.
    #. Create the task registry.
    #. Set itself as the current app (but not if the ``set_as_current``
       argument was disabled; see the sketch after this list).
    #. Call the :meth:`@on_init` callback (does nothing by default).
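
A minimal sketch of the ``set_as_current`` behavior (the app name is only
illustrative; ``app.set_current()`` can be used to opt in later):

.. code-block:: pycon

    >>> from celery import Celery, current_app

    >>> app = Celery('proj', set_as_current=False)
    >>> current_app.main == 'proj'    # some other (or the default) app is current
    False

    >>> app.set_current()
    >>> current_app.main == 'proj'
    True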

The :meth:`@task` decorator doesn't create the task at the point when
the task is defined; instead, it defers the creation
of the task to happen either when the task is used, or after the
application has been *finalized*.

This example shows how the task isn't created until
you use the task, or access an attribute (in this case by calling :func:`repr`):

.. code-block:: pycon

    >>> @app.task
    ... def add(x, y):
    ...     return x + y

    >>> type(add)
    <class 'celery.local.PromiseProxy'>

    >>> add.__evaluated__()
    False

    >>> add        # <-- causes repr(add) to happen
    <@task: __main__.add>

    >>> add.__evaluated__()
    True

*Finalization* of the app happens either explicitly by calling
:meth:`@finalize` -- or implicitly by accessing the :attr:`@tasks`
attribute.

Finalizing the object will:

    #. Copy tasks that must be shared between apps

        Tasks are shared by default, but if the
        ``shared`` argument to the task decorator is disabled,
        then the task will be private to the app it's bound to.

    #. Evaluate all pending task decorators.

    #. Make sure all tasks are bound to the current app.

        Tasks are bound to an app so that they can read default
        values from the configuration.
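
For a task that hasn't been touched yet, finalizing the app evaluates it.
A minimal sketch (the task is only illustrative):

.. code-block:: pycon

    >>> @app.task
    ... def mul(x, y):
    ...     return x * y

    >>> mul.__evaluated__()
    False

    >>> app.finalize()       # or implicitly: app.tasks
    >>> mul.__evaluated__()
    True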

.. _default-app:

.. topic:: The "default app"

    Celery didn't always have applications; it used to be that
    there was only a module-based API. A compatibility API was
    available at the old location until the release of Celery 5.0,
    but has been removed.

    Celery always creates a special app, the "default app",
    and this is used if no custom application has been instantiated.

    The :mod:`celery.task` module is no longer available. Use the
    methods on the app instance, not the module-based API:

    .. code-block:: python

        from celery.task import Task   # << OLD Task base class.

        from celery import Task        # << NEW base class.


Breaking the chain
==================

While it's possible to depend on the current app
being set, the best practice is to always pass the app instance
around to anything that needs it.

I call this the "app chain", since it creates a chain
of instances depending on the app being passed.

The following example is considered bad practice:

.. code-block:: python

    from celery import current_app

    class Scheduler:

        def run(self):
            app = current_app

Instead it should take the ``app`` as an argument:

.. code-block:: python

    class Scheduler:

        def __init__(self, app):
            self.app = app

Internally Celery uses the :func:`celery.app.app_or_default` function
so that everything also works in the module-based compatibility API:

.. code-block:: python

    from celery.app import app_or_default

    class Scheduler:
        def __init__(self, app=None):
            self.app = app_or_default(app)

In development you can set the :envvar:`CELERY_TRACE_APP`
environment variable to raise an exception if the app
chain breaks:

.. code-block:: console

    $ CELERY_TRACE_APP=1 celery worker -l INFO


.. topic:: Evolving the API

    Celery has changed a lot since it was initially created
    in 2009.

    For example, in the beginning it was possible to use any callable as
    a task:

    .. code-block:: pycon

        def hello(to):
            return 'hello {0}'.format(to)

        >>> from celery.execute import apply_async

        >>> apply_async(hello, ('world!',))

    or you could also create a ``Task`` class to set
    certain options, or override other behavior:

    .. code-block:: python

        from celery import Task
        from celery.registry import tasks

        class Hello(Task):
            queue = 'hipri'

            def run(self, to):
                return 'hello {0}'.format(to)
        tasks.register(Hello)

        >>> Hello.delay('world!')

    Later, it was decided that passing arbitrary callables
    was an anti-pattern, since it makes it very hard to use
    serializers other than pickle, and the feature was removed
    in 2.0, replaced by task decorators:

    .. code-block:: python

        from celery import app

        @app.task(queue='hipri')
        def hello(to):
            return 'hello {0}'.format(to)

Abstract Tasks
==============

All tasks created using the :meth:`@task` decorator
will inherit from the application's base :attr:`~@Task` class.

You can specify a different base class using the ``base`` argument:

.. code-block:: python

    @app.task(base=OtherTask)
    def add(x, y):
        return x + y

To create a custom task class, you should inherit from the neutral base
class, :class:`celery.Task`:

.. code-block:: python

    from celery import Task

    class DebugTask(Task):

        def __call__(self, *args, **kwargs):
            print('TASK STARTING: {0.name}[{0.request.id}]'.format(self))
            return self.run(*args, **kwargs)


.. tip::

    If you override the task's ``__call__`` method, then it's very important
    that you also call ``self.run`` to execute the body of the task.  Do not
    call ``super().__call__``.  The ``__call__`` method of the neutral base
    class :class:`celery.Task` is only present for reference.  For optimization,
    this has been unrolled into ``celery.app.trace.build_tracer.trace_task``
    which calls ``run`` directly on the custom task class if no ``__call__``
    method is defined.

The neutral base class is special because it's not bound to any specific app
yet. Once a task is bound to an app it'll read configuration to set default
values, and so on.

To use such a base class, create a task with the :meth:`@task`
decorator:

.. code-block:: python

    @app.task(base=DebugTask)
    def add(x, y):
        return x + y

It's even possible to change the default base class for an application
by changing its :attr:`@Task` attribute:

.. code-block:: pycon

    >>> from celery import Celery, Task

    >>> app = Celery()

    >>> class MyBaseTask(Task):
    ...    queue = 'hipri'

    >>> app.Task = MyBaseTask
    >>> app.Task
    <unbound MyBaseTask>

    >>> @app.task
    ... def add(x, y):
    ...     return x + y

    >>> add
    <@task: __main__.add>

    >>> add.__class__.mro()
    [<class add of <Celery __main__:0x1012b4410>>,
     <unbound MyBaseTask>,
     <unbound Task>,
     <type 'object'>]