.. _deploy:
Deploying Falcon on Linux with NGINX and uWSGI
==============================================
NGINX is a powerful web server and reverse proxy, and uWSGI is a fast and
highly-configurable WSGI application server. Together, NGINX and uWSGI create a
one-two punch of speed and functionality that will suffice for most
applications. In addition, this stack provides the building blocks for a
horizontally-scalable and highly-available (HA) production environment; the
configuration below is just a starting point.
This guide provides instructions for deploying to a Linux environment only.
However, with a bit of effort you should be able to adapt this configuration to
other operating systems, such as OpenBSD.
Running your Application as a Different User
''''''''''''''''''''''''''''''''''''''''''''
It is best to execute the application as a different OS user than the one who
owns the source code for your application. The application user should *NOT*
have write access to your source. This mitigates the chance that someone could
write a malicious Python file to your source directory through an upload
endpoint you might define; when your application restarts, the malicious file is
loaded and proceeds to cause any number of Bad Things™ to happen.
.. code:: sh
$ useradd myproject --create-home
$ useradd myproject-runner --no-create-home
It is helpful to switch to the project user (``myproject``) and use the home
directory as the application environment.
If you are working on a remote server, switch to the ``myproject`` user and pull
down the source code for your application.
.. code:: sh
$ git clone git@github.com:myorg/myproject.git /home/myproject/src
.. note::
You could use a tarball, zip file, scp or any other means to get your source
onto a server.
Next, create a virtual environment which can be used to install your
dependencies.
.. code:: sh
$ python3 -m venv /home/myproject/venv
Then install your dependencies.
.. code:: sh
$ /home/myproject/venv/bin/pip install -r /home/myproject/src/requirements.txt
$ /home/myproject/venv/bin/pip install -e /home/myproject/src
$ /home/myproject/venv/bin/pip install uwsgi
.. note::
The exact commands for creating a virtual environment might differ based on
the Python version you are using and your operating system. At the end of the
day the application needs a virtualenv in /home/myproject/venv with the
project dependencies installed. Use the ``pip`` binary within the virtual
environment by ``source venv/bin/activate`` or using the full path.
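With the source and virtual environment in place, it is worth double-checking
that the ``myproject-runner`` user cannot write to the source tree. A minimal
way to do that, assuming the paths used above, is:

.. code:: sh

    # The project user owns the source; everyone else gets read-only access
    $ sudo chown -R myproject:myproject /home/myproject/src
    $ sudo chmod -R o-w /home/myproject/src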
Preparing your Application for Service
''''''''''''''''''''''''''''''''''''''
For the purposes of this tutorial, we'll assume that you have implemented
a way to configure your application, such as with a
``create_app()`` function or a module-level script. The role of this
function or script is to supply an instance of :any:`falcon.App`, which
implements the standard WSGI callable interface.
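If your project does not yet have such a factory, a minimal sketch might look
something like the following; the module path, the ``HealthResource`` class,
and the ``config`` argument are illustrative placeholders for your own code:

.. code-block:: python
    :caption: a hypothetical myproject/app.py

    import falcon


    class HealthResource:
        def on_get(self, req, resp):
            # A simple liveness endpoint, handy for smoke tests
            resp.media = {'status': 'ok'}


    def create_app(config=None):
        # Build and return the WSGI callable, wiring up routes and any
        # middleware or configuration your project requires.
        app = falcon.App()
        app.add_route('/health', HealthResource())
        return app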
You will need to expose the :any:`falcon.App` instance in some way so that
uWSGI can find it. For this tutorial we recommend creating a ``wsgi.py`` file.
Modify the logic of the following example file to properly configure your
application. Ensure that you expose a variable called ``application`` which
is assigned to your :any:`falcon.App` instance.
.. code-block:: python
:caption: /home/myproject/src/wsgi.py
import os
import myproject
# Replace with your app's method of configuration
config = myproject.get_config(os.environ['MYPROJECT_CONFIG'])
# uWSGI will look for this variable
application = myproject.create_app(config)
Note that in the above example, the WSGI callable is simply assigned to a
variable, ``application``, rather than being passed to a self-hosting
WSGI server such as ``wsgiref.simple_server.make_server``. Starting an
independent WSGI server in your ``wsgi.py`` file will lead to unexpected
results.
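If you would like to smoke-test ``wsgi.py`` locally without uWSGI, one option
is to guard a development server behind a ``__main__`` check so that it only
runs when the file is executed directly; uWSGI simply imports the module and
picks up ``application``:

.. code-block:: python
    :caption: optional addition to the end of wsgi.py (local testing only)

    if __name__ == '__main__':
        # This block never runs under uWSGI, which imports this module
        # rather than executing it as a script.
        from wsgiref.simple_server import make_server

        with make_server('127.0.0.1', 8000, application) as httpd:
            print('Serving on http://127.0.0.1:8000 ...')
            httpd.serve_forever()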
Deploying Falcon behind uWSGI
'''''''''''''''''''''''''''''
With your ``wsgi.py`` file in place, it is time to configure uWSGI. Start by
creating a simple ``uwsgi.ini`` file. In general, you shouldn't commit this
file to source control; it should be generated from a template by your
deployment toolchain according to the target environment (number of CPUs, etc.).
This configuration, when executed, will create a new uWSGI server backed by
your ``wsgi.py`` file and listening at ``127.0.0.1:8080``.
.. code-block:: ini
:caption: /home/myproject/src/uwsgi.ini
[uwsgi]
master = 1
vacuum = true
socket = 127.0.0.1:8080
enable-threads = true
thunder-lock = true
threads = 2
processes = 2
virtualenv = /home/myproject/venv
wsgi-file = /home/myproject/src/wsgi.py
chdir = /home/myproject/src
uid = myproject-runner
gid = myproject-runner
.. note::
**Threads vs. Processes**
There are many questions to consider when deciding how to manage the processes
that actually run your Python code. Are you generally CPU-bound or I/O-bound?
Is your application thread-safe? How many CPUs do you have? What system are
you on? Do you need an in-process cache?
The configuration presented here enables both threads and processes. However,
you will have to experiment and do some research to understand your
application's unique requirements, and then tailor your uWSGI configuration
accordingly. Generally speaking, uWSGI is flexible enough to support most
types of applications.
.. note::
**TCP vs. UNIX Sockets**
NGINX and uWSGI can communicate via normal TCP (using an IP address) or UNIX
sockets (using a socket file). TCP sockets are easier to set up and generally
work for simple deployments. If you want to have finer control over which
processes, users, or groups may access the uWSGI application, or you are looking
for a bit of a speed boost, consider using UNIX sockets. uWSGI can adjust
the socket file's permissions with ``chmod-socket`` and its ownership with
``chown-socket``.
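As an illustration, a UNIX-socket variant of the ``uwsgi.ini`` shown above
might swap the TCP ``socket`` line for something like this (the socket path is
arbitrary, and ``www-data`` stands in for whatever user NGINX runs as on your
system):

.. code-block:: ini
    :caption: uwsgi.ini fragment for a UNIX socket (illustrative)

    [uwsgi]
    # The socket file must live somewhere both uWSGI and NGINX can access
    socket = /run/myproject/uwsgi.sock
    chown-socket = myproject-runner:www-data
    chmod-socket = 660

On the NGINX side, the ``uwsgi_pass`` directive would then reference the socket
file instead of an address, e.g. ``uwsgi_pass unix:/run/myproject/uwsgi.sock;``.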
The ``uid`` and ``gid`` settings, as shown above, are critical to securing your
deployment. These values control the OS-level user and group the server
will use to execute the application. The specified OS user and group should not
have write permissions to the source directory. In this case, we use the
``myproject-runner`` user that was created earlier for this purpose.
You can now start uWSGI like this:
.. code:: sh
$ /home/myproject/venv/bin/uwsgi -c uwsgi.ini
If everything goes well, you should see something like this:
::
*** Operational MODE: preforking+threaded ***
...
*** uWSGI is running in multiple interpreter mode ***
...
spawned uWSGI master process (pid: 91828)
spawned uWSGI worker 1 (pid: 91866, cores: 2)
spawned uWSGI worker 2 (pid: 91867, cores: 2)
.. note::
It is always a good idea to keep an eye on the uWSGI logs, as they will contain
exceptions and other information from your application that can help shed some
light on unexpected behaviors.
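With the configuration above, uWSGI writes its logs to the terminal it was
started from. If you would rather capture them in a file, one option is to add
a ``logto`` entry to ``uwsgi.ini`` (the path is illustrative and must be
writable by the ``myproject-runner`` user):

.. code-block:: ini

    # Write uWSGI and application output to a file instead of the terminal
    logto = /var/log/uwsgi/myproject.log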
Connecting NGINX and uWSGI
''''''''''''''''''''''''''
Although uWSGI may serve HTTP requests directly, it can be helpful to use a reverse
proxy, such as NGINX, to offload TLS negotiation, static file serving, etc.
NGINX natively supports `the uwsgi protocol <https://uwsgi-docs.readthedocs.io/en/latest/Protocol.html>`_ for efficiently proxying requests to uWSGI. In
NGINX parlance, we will create an "upstream" and direct that upstream (via a TCP
socket) to our now-running uWSGI application.
Before proceeding, install NGINX according to `the instructions for your
platform <https://docs.nginx.com/nginx/admin-guide/installing-nginx/installing-nginx-open-source/>`_.
Then, create an NGINX conf file that looks something like this:
.. code-block:: nginx
:caption: /etc/nginx/sites-available/myproject.conf
server {
listen 80;
server_name myproject.com;
access_log /var/log/nginx/myproject-access.log;
error_log /var/log/nginx/myproject-error.log warn;
location / {
uwsgi_pass 127.0.0.1:8080;
include uwsgi_params;
}
}
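On distributions that use the ``sites-available``/``sites-enabled`` layout
referenced above, you will also need to enable the site and, ideally, validate
the configuration before reloading:

.. code:: sh

    $ sudo ln -s /etc/nginx/sites-available/myproject.conf /etc/nginx/sites-enabled/
    $ sudo nginx -t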
Finally, start (or restart) NGINX:
.. code-block:: sh
$ sudo service nginx start
You should now have a working application. Check your uWSGI and NGINX logs for
errors if the application does not start.
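A quick way to verify the whole chain (NGINX to uWSGI to your application) is
to issue a request through NGINX; because the server block above matches on
``server_name myproject.com``, you can supply the host header explicitly:

.. code:: sh

    $ curl -v -H 'Host: myproject.com' http://127.0.0.1/

Any response generated by your application, even a 404 for an unknown path,
confirms that requests are flowing through NGINX and uWSGI.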
Further Considerations
''''''''''''''''''''''
We did not explain how to configure TLS (HTTPS) for NGINX, leaving that as an
exercise for the reader. However, we do recommend using Let's Encrypt, which offers free,
short-term certificates with auto-renewal. Visit the `Let’s Encrypt site`_ to learn
how to integrate their service directly with NGINX.
In addition to setting up NGINX and uWSGI to run your application, you will of
course need to deploy a database server or any other services required by your
application. Due to the wide variety of options and considerations in this
space, we have chosen not to include ancillary services in this guide. However,
the Falcon community is always happy to help with deployment questions, so
`please don't hesitate to ask <https://falcon.readthedocs.io/en/stable/community/help.html#chat>`_.
.. _`Let’s Encrypt site`: https://certbot.eff.org/