:mod:`!concurrent.interpreters` --- Multiple interpreters in the same process
=============================================================================
.. module:: concurrent.interpreters
:synopsis: Multiple interpreters in the same process
.. moduleauthor:: Eric Snow <ericsnowcurrently@gmail.com>
.. sectionauthor:: Eric Snow <ericsnowcurrently@gmail.com>
.. versionadded:: 3.14
**Source code:** :source:`Lib/concurrent/interpreters`
--------------
The :mod:`!concurrent.interpreters` module constructs higher-level
interfaces on top of the lower level :mod:`!_interpreters` module.
The module is primarily meant to provide a basic API for managing
interpreters (AKA "subinterpreters") and running things in them.
Running mostly involves switching to an interpreter (in the current
thread) and calling a function in that execution context.
For concurrency, interpreters themselves (and this module) don't
provide much more than isolation, which on its own isn't useful.
Actual concurrency is available separately through
:mod:`threads <threading>`. See `below <interp-concurrency_>`_.
.. seealso::
:class:`~concurrent.futures.InterpreterPoolExecutor`
combines threads with interpreters in a familiar interface (see the example below).
.. XXX Add references to the upcoming HOWTO docs in the seealso block.
:ref:`isolating-extensions-howto`
how to update an extension module to support multiple interpreters
:pep:`554`
:pep:`734`
:pep:`684`
.. XXX Why do we disallow multiple interpreters on WASM?
.. include:: ../includes/wasm-notavail.rst
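
For example, the executor mentioned above lets you distribute ordinary
:mod:`concurrent.futures` calls across worker threads that are each backed
by their own interpreter. A minimal, illustrative sketch (the worker count
and the task here are arbitrary choices)::

    from concurrent.futures import InterpreterPoolExecutor
    from math import sqrt

    with InterpreterPoolExecutor(max_workers=2) as pool:
        # Each call runs in a worker thread backed by its own interpreter.
        results = list(pool.map(sqrt, [1, 4, 9]))

    print(results)   # [1.0, 2.0, 3.0]
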
Key details
-----------
Before we dive in further, there are a small number of details
to keep in mind about using multiple interpreters:
* `isolated <interp-isolation_>`_, by default
* no implicit threads
* not all PyPI packages support use in multiple interpreters yet
.. XXX Are there other relevant details to list?
.. _interpreters-intro:
Introduction
------------
An "interpreter" is effectively the execution context of the Python
runtime. It contains all of the state the runtime needs to execute
a program. This includes things like the import state and builtins.
(Each thread, even if there's only the main thread, has some extra
runtime state, in addition to the current interpreter, related to
the current exception and the bytecode eval loop.)
The concept and functionality of the interpreter have been a part of
Python since version 2.2, but the feature was only available through
the C-API and not well known, and the `isolation <interp-isolation_>`_
was relatively incomplete until version 3.12.
.. _interp-isolation:
Multiple Interpreters and Isolation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A Python implementation may support using multiple interpreters in the
same process. CPython has this support. Each interpreter is
effectively isolated from the others (with a limited number of
carefully managed process-global exceptions to the rule).
That isolation is primarily useful as a strong separation between
distinct logical components of a program, where you want to have
careful control of how those components interact.
.. note::
Interpreters in the same process can technically never be strictly
isolated from one another since there are few restrictions on memory
access within the same process. The Python runtime makes a best
effort at isolation but extension modules may easily violate that.
Therefore, do not use multiple interpreters in security-sensitive
situations, where they shouldn't have access to each other's data.
Running in an Interpreter
^^^^^^^^^^^^^^^^^^^^^^^^^
Running in a different interpreter involves switching to it in the
current thread and then calling some function. The runtime will
execute the function using the current interpreter's state. The
:mod:`!concurrent.interpreters` module provides a basic API for
creating and managing interpreters, as well as the switch-and-call
operation.
No other threads are automatically started for the operation.
There is `a helper <interp-call-in-thread_>`_ for that though.
There is another dedicated helper for calling the builtin
:func:`exec` in an interpreter.
When :func:`exec` (or :func:`eval`) is called in an interpreter,
it runs using the interpreter's :mod:`!__main__` module as the
"globals" namespace. The same is true for functions that aren't
associated with any module. This is the same as how scripts invoked
from the command-line run in the :mod:`!__main__` module.
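
For example, a value bound into an interpreter's :mod:`!__main__` module
is visible to later :func:`exec` calls in that interpreter. Here is a
minimal sketch using :func:`create`, :meth:`Interpreter.prepare_main`,
and :meth:`Interpreter.exec`, all documented below::

    from concurrent import interpreters

    interp = interpreters.create()
    # Bind a name in the interpreter's __main__ module.
    interp.prepare_main(answer=42)
    # exec() runs with that __main__ module as the "globals" namespace.
    interp.exec('print(answer)')
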
.. _interp-concurrency:
Concurrency and Parallelism
^^^^^^^^^^^^^^^^^^^^^^^^^^^
As noted earlier, interpreters do not provide any concurrency
on their own. They strictly represent the isolated execution
context the runtime will use *in the current thread*. That isolation
makes them similar to processes, but they still enjoy in-process
efficiency, like threads.
All that said, interpreters do naturally support certain flavors of
concurrency, as a powerful side effect of that isolation. That
isolation enables a different approach to concurrency than you can
take with async or threads: a model similar to CSP or the actor
model, which is relatively easy to reason about.
You can take advantage of that concurrency model in a single thread,
switching back and forth between interpreters, Stackless-style.
However, this model is more useful when you combine interpreters
with multiple threads. This mostly involves starting a new thread,
where you switch to another interpreter and run what you want there.
Each actual thread in Python, even if you're only running in the main
thread, has its own *current* execution context. Multiple threads can
use the same interpreter or different ones.
At a high level, you can think of the combination of threads and
interpreters as threads with opt-in sharing.
As a significant bonus, interpreters are sufficiently isolated that
they do not share the :term:`GIL`, which means combining threads with
multiple interpreters enables full multi-core parallelism.
(This has been the case since Python 3.12.)
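
As a deliberately simplistic sketch, the following runs a CPU-bound
function in several interpreters at once, each in its own thread.
Because the interpreters do not share the :term:`GIL`, the work can
proceed on multiple cores in parallel::

    from concurrent import interpreters

    def count(n):
        # A simple CPU-bound task.
        total = 0
        for i in range(n):
            total += i
        return total

    # Each interpreter has its own GIL, so these threads can run
    # on separate cores at the same time.
    interps = [interpreters.create() for _ in range(4)]
    threads = [interp.call_in_thread(count, 5_000_000) for interp in interps]
    for t in threads:
        t.join()
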
Communication Between Interpreters
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In practice, multiple interpreters are useful only if we have a way
to communicate between them. This usually involves some form of
message passing, but can even mean sharing data in some carefully
managed way.
With this in mind, the :mod:`!concurrent.interpreters` module provides
a :class:`queue.Queue` implementation, available through
:func:`create_queue`.
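
A minimal sketch: create a queue, bind it into a new interpreter's
:mod:`!__main__` module, and pass an object through it (the variable
names here are arbitrary)::

    from concurrent import interpreters

    queue = interpreters.create_queue()
    interp = interpreters.create()

    # Make the queue available as "queue" in the other interpreter.
    interp.prepare_main(queue=queue)

    queue.put('ham')
    # The other interpreter receives the object through the shared queue.
    interp.exec('print(queue.get())')
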
.. _interp-object-sharing:
"Sharing" Objects
^^^^^^^^^^^^^^^^^
Any data actually shared between interpreters loses the thread-safety
provided by the :term:`GIL`. There are various options for dealing with
this in extension modules. However, from Python code the lack of
thread-safety means objects can't actually be shared, with a few
exceptions. Instead, a copy must be created, which means mutable
objects won't stay in sync.
By default, most objects are copied with :mod:`pickle` when they are
passed to another interpreter. Nearly all of the immutable builtin
objects are either directly shared or copied efficiently. For example:
* :const:`None`
* :class:`bool` (:const:`True` and :const:`False`)
* :class:`bytes`
* :class:`str`
* :class:`int`
* :class:`float`
* :class:`tuple` (of similarly supported objects)
There is a small number of Python types that actually share mutable
data between interpreters:
* :class:`memoryview`
* :class:`Queue`
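
To illustrate the copy semantics described above, here is a small
sketch: the dict is copied (via :mod:`pickle`) when it is bound into
the other interpreter, so a change made there does not affect the
original::

    from concurrent import interpreters

    interp = interpreters.create()

    data = {'spam': 1}
    interp.prepare_main(data=data)      # sends a copy, not the object
    interp.exec("data['spam'] = 42")    # modifies only the copy
    print(data['spam'])                 # still 1 in this interpreter
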
Reference
---------
This module defines the following functions:
.. function:: list_all()
Return a :class:`list` of :class:`Interpreter` objects,
one for each existing interpreter.
.. function:: get_current()
Return an :class:`Interpreter` object for the currently running
interpreter.
.. function:: get_main()
Return an :class:`Interpreter` object for the main interpreter.
This is the interpreter the runtime created to run the :term:`REPL`
or the script given at the command-line. It is usually the only one.
.. function:: create()
Initialize a new (idle) Python interpreter
and return an :class:`Interpreter` object for it.
.. function:: create_queue()
Initialize a new cross-interpreter queue and return a :class:`Queue`
object for it.
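
A short sketch tying these functions together (the printed values will
depend on which interpreters already exist in the process)::

    from concurrent import interpreters

    main = interpreters.get_main()
    current = interpreters.get_current()
    print(current.id == main.id)   # True when run in the main interpreter

    interp = interpreters.create()
    print(interp.id, interp.whence)
    print([i.id for i in interpreters.list_all()])
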
Interpreter objects
^^^^^^^^^^^^^^^^^^^
.. class:: Interpreter(id)
A single interpreter in the current process.
Generally, :class:`Interpreter` shouldn't be called directly.
Instead, use :func:`create` or one of the other module functions.
.. attribute:: id
(read-only)
The underlying interpreter's ID.
.. attribute:: whence
(read-only)
A string describing where the interpreter came from.
.. method:: is_running()
Return ``True`` if the interpreter is currently executing code
in its :mod:`!__main__` module and ``False`` otherwise.
.. method:: close()
Finalize and destroy the interpreter.
.. method:: prepare_main(ns=None, **kwargs)
Bind objects in the interpreter's :mod:`!__main__` module.
Some objects are actually shared and some are copied efficiently,
but most are copied via :mod:`pickle`. See :ref:`interp-object-sharing`.
.. method:: exec(code, /, dedent=True)
Run the given source code in the interpreter (in the current thread).
.. method:: call(callable, /, *args, **kwargs)
Return the result of calling the given function in the
interpreter (in the current thread).
.. _interp-call-in-thread:
.. method:: call_in_thread(callable, /, *args, **kwargs)
Run the given function in the interpreter (in a new thread).
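
A brief sketch of these methods in use; note that
:meth:`Interpreter.is_running` only reports ``True`` while code is
actually executing in the interpreter's :mod:`!__main__` module::

    from concurrent import interpreters

    interp = interpreters.create()
    print(interp.is_running())        # False: nothing is executing in it
    interp.prepare_main(name='spam')
    interp.exec('print(name)')
    print(interp.call(len, 'spam'))   # 4
    interp.close()                    # finalize and destroy the interpreter
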
Exceptions
^^^^^^^^^^
.. exception:: InterpreterError
This exception, a subclass of :exc:`Exception`, is raised when
an interpreter-related error happens.
.. exception:: InterpreterNotFoundError
This exception, a subclass of :exc:`InterpreterError`, is raised when
the targeted interpreter no longer exists.
.. exception:: ExecutionFailed
This exception, a subclass of :exc:`InterpreterError`, is raised when
code running in the interpreter raises an uncaught exception.
.. attribute:: excinfo
A basic snapshot of the exception raised in the other interpreter.
.. XXX Document the excinfo attrs?
.. exception:: NotShareableError
This exception, a subclass of :exc:`TypeError`, is raised when
an object cannot be sent to another interpreter.
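
When code run with :meth:`Interpreter.exec` raises an uncaught
exception, it is reported to the caller as an :exc:`ExecutionFailed`
carrying a snapshot of the original error. A minimal sketch::

    from concurrent import interpreters

    interp = interpreters.create()
    try:
        interp.exec('1 / 0')
    except interpreters.ExecutionFailed as exc:
        # excinfo is a snapshot of the ZeroDivisionError raised
        # in the other interpreter.
        print(exc.excinfo)
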
Communicating Between Interpreters
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. class:: Queue(id)
A wrapper around a low-level, cross-interpreter queue, which
implements the :class:`queue.Queue` interface. The underlying queue
can only be created through :func:`create_queue`.
Some objects are actually shared and some are copied efficiently,
but most are copied via :mod:`pickle`. See :ref:`interp-object-sharing`.
.. attribute:: id
(read-only)
The queue's ID.
.. exception:: QueueEmptyError
This exception, a subclass of :exc:`queue.Empty`, is raised from
:meth:`!Queue.get` and :meth:`!Queue.get_nowait` when the queue
is empty.
.. exception:: QueueFullError
This exception, a subclass of :exc:`queue.Full`, is raised from
:meth:`!Queue.put` and :meth:`!Queue.put_nowait` when the queue
is full.
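
For example, the non-blocking queue methods raise these exceptions
instead of waiting (a small sketch)::

    from concurrent import interpreters

    queue = interpreters.create_queue()
    try:
        queue.get_nowait()
    except interpreters.QueueEmptyError:
        print('nothing to receive yet')

    queue.put('spam')
    print(queue.get())   # 'spam'
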
Basic usage
-----------
Creating an interpreter and running code in it::

    from concurrent import interpreters

    interp = interpreters.create()

    # Run in the current OS thread.
    interp.exec('print("spam!")')

    interp.exec("""if True:
        print('spam!')
        """)

    from textwrap import dedent
    interp.exec(dedent("""
        print('spam!')
        """))

    def run(arg):
        return arg

    res = interp.call(run, 'spam!')
    print(res)

    def run():
        print('spam!')

    interp.call(run)

    # Run in new OS thread.
    t = interp.call_in_thread(run)
    t.join()
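
Sending objects to code running in another interpreter, in its own OS
thread, by way of a queue (a sketch that combines the pieces shown
above; the ``None`` sentinel and the variable names are arbitrary)::

    from concurrent import interpreters
    import threading

    queue = interpreters.create_queue()
    interp = interpreters.create()
    interp.prepare_main(queue=queue)

    # Consume items in the other interpreter, in a separate OS thread.
    t = threading.Thread(target=interp.exec, args=("""if True:
        while True:
            item = queue.get()
            if item is None:
                break
            print('got:', item)
        """,))
    t.start()

    for item in ('spam', 'ham', None):
        queue.put(item)
    t.join()
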