File: basics.md

# Basics

## The anatomy of a trigger

Postgres triggers are database functions written in PL/pgSQL that execute based on events
and conditions.

The [pgtrigger.Trigger][] object is the base class for all triggers in `django-pgtrigger`. Its attributes mirror the syntax required for [making a Postgres trigger](https://www.postgresql.org/docs/current/sql-createtrigger.html). Here are the most common attributes you'll use:

* **name**

    The identifying name of the trigger. It must be unique for every model and less than 48 characters.

* **operation**

    The table operation that fires a trigger. Operations are [pgtrigger.Update][], [pgtrigger.Insert][], [pgtrigger.Delete][], [pgtrigger.Truncate][], or [pgtrigger.UpdateOf][]. They can be `OR`ed together (e.g.  `pgtrigger.Insert | pgtrigger.Update`) to configure triggers on a combination of operations.

    !!! note

        [pgtrigger.UpdateOf][] fires when columns appear in an `UPDATE` statement. It will not fire if other triggers update the columns. See the notes in the [Postgres docs](https://www.postgresql.org/docs/12/sql-createtrigger.html) for more information.

* **when**

    When the trigger should run in relation to the operation. [pgtrigger.Before][] executes the trigger before the operation, and [pgtrigger.After][] executes it after. [pgtrigger.InsteadOf][] is used for SQL views.

    !!! note

        [pgtrigger.Before][] and [pgtrigger.After][] can be used on SQL views under some circumstances. See [the Postgres docs](https://www.postgresql.org/docs/12/sql-createtrigger.html) for a breakdown.

* **condition** *(optional)*

    Conditionally execute the trigger based on the `OLD` or `NEW` rows.

    Use [pgtrigger.Q][] and [pgtrigger.F][] objects for constructing `WHERE` clauses with the `OLD` and `NEW` rows. See [the conditional triggers section](conditional.md) for more details and other utilities.

    !!! note

        Be sure to familiarize yourself with `OLD` and `NEW` rows when writing conditions by consulting the [Postgres docs](https://www.postgresql.org/docs/current/plpgsql-trigger.html). For example, `OLD` is always `NULL` in [pgtrigger.Insert][] triggers.
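Putting the common attributes together, here is a hedged sketch of a conditional trigger. The `Order` model and its `is_active` field are hypothetical, chosen only for illustration; [pgtrigger.Protect][] supplies the `when` and `func` internally, so only `name`, `operation`, and `condition` need to be given:

```python
from django.db import models
import pgtrigger


class Order(models.Model):
    # Hypothetical field used by the trigger condition below
    is_active = models.BooleanField(default=True)

    class Meta:
        triggers = [
            # Block deletes, but only when the OLD row is still active.
            # Inactive rows can be deleted freely.
            pgtrigger.Protect(
                name='protect_active_deletes',
                operation=pgtrigger.Delete,
                condition=pgtrigger.Q(old__is_active=True),
            )
        ]
```

Because the condition references the `OLD` row, this trigger never consults `NEW`, which is `NULL` for `DELETE` operations anyway.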

Here are attributes you'll need when writing more complex triggers.

* **func**

    The raw PL/pgSQL function that is executed.

    !!! note

        This is *not* the entire declared trigger function, but rather the snippet of PL/pgSQL that is nested in the `DECLARE ... BEGIN ... END` portion of the trigger.

* **declare** *(optional)*

    Define additional variable declarations as a list of `(variable_name, variable_type)` tuples. For example, `declare=[('my_var_1', 'BOOLEAN'), ('my_var_2', 'JSONB')]`.

* **level** *(optional, default=pgtrigger.Row)*

    Configures the trigger to fire once for every row ([pgtrigger.Row][]) or once for every statement ([pgtrigger.Statement][]).

* **referencing** *(optional)*

    References the `OLD` and `NEW` rows as transition tables in statement-level triggers. For example, `pgtrigger.Referencing(old='old_table_name', new='new_table_name')` makes `old_table_name` and `new_table_name` available as transition tables. See [this StackExchange answer](https://dba.stackexchange.com/a/177468) for additional details, and see the [Cookbook](cookbook.md) for an example.

* **timing** *(optional)*

    Create a deferrable `CONSTRAINT` trigger when set. Use [pgtrigger.Immediate][] to execute the trigger at the end of a statement and [pgtrigger.Deferred][] to execute it at the end of a transaction.

    !!! note

        Deferrable triggers must have the `level` set to [pgtrigger.Row][] and `when` set to [pgtrigger.After][].
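The attributes above combine when writing a trigger with a raw `func`. The sketch below is illustrative only: the `Document` model, its `status` field, and the trigger name are assumptions, but it shows how `declare` introduces a PL/pgSQL variable that the `func` body can use, and how `func` contains only the body (not the full function declaration), which `django-pgtrigger` wraps for you:

```python
from django.db import models
import pgtrigger


class Document(models.Model):
    # Hypothetical field normalized by the trigger below
    status = models.CharField(max_length=32)

    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name='normalize_status',
                level=pgtrigger.Row,
                when=pgtrigger.Before,
                operation=pgtrigger.Insert | pgtrigger.Update,
                # Variables declared here are available in `func`
                declare=[('normalized', 'TEXT')],
                # Only the function body; the surrounding
                # DECLARE/BEGIN/END scaffolding is generated
                func="""
                    normalized := LOWER(NEW.status);
                    NEW.status := normalized;
                    RETURN NEW;
                """,
            )
        ]
```

Since this is a row-level `BEFORE` trigger, returning the modified `NEW` row is what persists the change.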

## Defining and installing triggers

Triggers are defined in the `triggers` attribute of the model `Meta` class. For example, this trigger protects the model from being deleted:

```python
from django.db import models
import pgtrigger


class CannotDelete(models.Model):
    class Meta:
        triggers = [
            pgtrigger.Protect(name='protect_deletes', operation=pgtrigger.Delete)
        ]
```

Triggers are installed by first running `python manage.py makemigrations` and then `python manage.py migrate`.

If you'd like to install a trigger on a model of a third-party app, see the [Advanced Installation](advanced_installation.md) section. This section also covers how you can manually install, enable, and disable triggers globally.

<a id="advantages_of_triggers"></a>
## The advantages over signals and model methods

There are three key advantages to using triggers over implementing the logic in a [Django signal handler](https://docs.djangoproject.com/en/4.1/topics/signals/) or by overriding model methods:

1. **Reliability**: Unlike Python code, triggers run alongside queries in the database, ensuring that nothing falls through the cracks. Signals and model methods, on the other hand, can provide a false sense of security. For example, signals aren't fired for `bulk_create`, and custom model methods aren't called in data migrations by default. Third-party apps that bypass the ORM will also not work reliably.
2. **Complexity**: Complexity can balloon when trying to override models, managers, or querysets to accomplish the same logic a trigger can support. Even simple routines such as conditionally running code based on a changed field are difficult to implement correctly and prone to race conditions.
3. **Performance**: Triggers can perform SQL queries without needing to do expensive round trips to the database to fetch data. This can be a major performance enhancement for routines like history tracking or data denormalization.