Author: Andreas Tille <tille@debian.org>
Last-Update: 2022-06-23
Description: Make sure the documentation applies to Debian installations with Python 3
 Replace invocations of python with python3 in the documented example commands,
 since on Debian the interpreter is invoked as python3.

--- toil.orig/docs/gettingStarted/install.rst
+++ toil/docs/gettingStarted/install.rst
@@ -35,7 +35,7 @@
     $ curl -O https://pypi.python.org/packages/d4/0c/9840c08189e030873387a73b90ada981885010dd9aea134d6de30cd24cb8/virtualenv-15.1.0.tar.gz
     $ tar xvfz virtualenv-15.1.0.tar.gz
     $ cd virtualenv-15.1.0
-    $ python virtualenv.py ~/venv
+    $ python3 virtualenv.py ~/venv
 
 Now that you've created your virtualenv, activate your virtual environment::
 
--- toil.orig/docs/gettingStarted/quickStart.rst
+++ toil/docs/gettingStarted/quickStart.rst
@@ -16,7 +16,7 @@
 
 2. Specify the name of the :ref:`job store <jobStoreOverview>` and run the workflow::
 
-       python helloWorld.py file:my-job-store
+       python3 helloWorld.py file:my-job-store
 
 Congratulations! You've run your first Toil workflow using the default :ref:`Batch System <batchsysteminterface>`, ``singleMachine``,
 using the ``file`` job store.
@@ -28,7 +28,7 @@
 Toil supports many different batch systems such as `Apache Mesos`_ and Grid Engine; its versatility makes it
 easy to run your workflow in all kinds of places.
 
-Toil is totally customizable! Run ``python helloWorld.py --help`` to see a complete list of available options.
+Toil is totally customizable! Run ``python3 helloWorld.py --help`` to see a complete list of available options.
 
 For something beyond a "Hello, world!" example, refer to :ref:`runningDetail`.
 
@@ -140,7 +140,7 @@
 
 #. Run it with the default settings::
 
-      $ python sort.py file:jobStore
+      $ python3 sort.py file:jobStore
 
    The workflow created a file called ``sortedFile.txt`` in your current directory.
    Have a look at it and notice that it contains a whole lot of sorted lines!
@@ -157,7 +157,7 @@
 
 3. Run with custom options::
 
-      $ python sort.py file:jobStore \
+      $ python3 sort.py file:jobStore \
                    --numLines=5000 \
                    --lineLength=10 \
                    --overwriteOutput=True \
@@ -262,7 +262,7 @@
 boilerplate ensures that the pipeline is only started once---on the leader---but
 not when its job functions are imported and executed on the individual workers.
 
-Typing ``python sort.py --help`` will show the complete list of
+Typing ``python3 sort.py --help`` will show the complete list of
 arguments for the workflow which includes both Toil's and ones defined inside
 ``sort.py``. A complete explanation of Toil's arguments can be
 found in :ref:`commandRef`.
@@ -276,7 +276,7 @@
 with the ``--logLevel`` flag. For example, to only log ``CRITICAL`` level
 messages to the screen::
 
-   $ python sort.py file:jobStore \
+   $ python3 sort.py file:jobStore \
                 --logLevel=critical \
                 --overwriteOutput=True
 
@@ -302,7 +302,7 @@
 
 When we run the pipeline, Toil will show a detailed failure log with a traceback::
 
-   $ python sort.py file:jobStore
+   $ python3 sort.py file:jobStore
    ...
    ---TOIL WORKER OUTPUT LOG---
    ...
@@ -324,13 +324,13 @@
 failure, the job store is preserved so that the workflow can be restarted,
 starting from the previously failed jobs. We can restart the pipeline by running ::
 
-   $ python sort.py file:jobStore \
+   $ python3 sort.py file:jobStore \
                 --restart \
                 --overwriteOutput=True
 
 We can also change the number of times Toil will attempt to retry a failed job::
 
-   $ python sort.py file:jobStore \
+   $ python3 sort.py file:jobStore \
                 --retryCount 2 \
                 --restart \
                 --overwriteOutput=True
@@ -344,7 +344,7 @@
 
 ::
 
-    $ python sort.py file:jobStore \
+    $ python3 sort.py file:jobStore \
                  --restart \
                  --overwriteOutput=True
 
@@ -393,7 +393,7 @@
 
 #. Run the Toil script in the cluster::
 
-        $ python /tmp/helloWorld.py aws:us-west-2:my-S3-bucket
+        $ python3 /tmp/helloWorld.py aws:us-west-2:my-S3-bucket
 
    In this particular case, we create an S3 bucket called ``my-S3-bucket`` in
    the ``us-west-2`` availability zone to store intermediate job results.
--- toil.orig/docs/running/cloud/amazon.rst
+++ toil/docs/running/cloud/amazon.rst
@@ -141,7 +141,7 @@
 
 To run the sort example :ref:`sort example <sortExample>` with the AWS job store you would type ::
 
-    $ python sort.py aws:us-west-2:my-aws-sort-jobstore
+    $ python3 sort.py aws:us-west-2:my-aws-sort-jobstore
 
 .. _installProvisioner:
 
@@ -290,7 +290,7 @@
 
 #. Run the script as an autoscaling workflow: ::
 
-    $ python /root/sort.py aws:us-west-2:<my-jobstore-name> \
+    $ python3 /root/sort.py aws:us-west-2:<my-jobstore-name> \
           --provisioner aws \
           --nodeTypes c3.large \
           --maxNodes 2 \
@@ -317,7 +317,7 @@
 
 For more information on other autoscaling (and other) options have a look at :ref:`workflowOptions` and/or run ::
 
-    $ python my-toil-script.py --help
+    $ python3 my-toil-script.py --help
 
 .. important::
 
--- toil.orig/docs/running/cloud/gce.rst
+++ toil/docs/running/cloud/gce.rst
@@ -58,7 +58,7 @@
 
 Then to run the sort example with the Google job store you would type ::
 
-    $ python sort.py google:my-project-id:my-google-sort-jobstore
+    $ python3 sort.py google:my-project-id:my-google-sort-jobstore
 
 Running a Workflow with Autoscaling
 -----------------------------------
@@ -93,7 +93,7 @@
 
 #. Run the workflow::
 
-    $ python /root/sort.py  google:<PROJECT-ID>:<JOBSTORE-NAME> \
+    $ python3 /root/sort.py  google:<PROJECT-ID>:<JOBSTORE-NAME> \
           --provisioner gce \
           --batchSystem mesos \
           --nodeTypes n1-standard-2 \
--- toil.orig/docs/appendices/deploy.rst
+++ toil/docs/appendices/deploy.rst
@@ -70,7 +70,7 @@
 
 We can now run our workflow::
 
-   $ python main.py --batchSystem=mesos …
+   $ python3 main.py --batchSystem=mesos …
 
 .. important::
 
@@ -79,9 +79,9 @@
 
 .. warning::
 
-   Neither ``python setup.py develop`` nor ``pip install -e .`` can be used in
+   Neither ``python3 setup.py develop`` nor ``pip install -e .`` can be used in
    this process as, instead of copying the source files, they create ``.egg-link``
-   files that Toil can't auto-deploy. Similarly, ``python setup.py install``
+   files that Toil can't auto-deploy. Similarly, ``python3 setup.py install``
    doesn't work either as it installs the project as a Python ``.egg`` which is
    also not currently supported by Toil (though it `could be`_ in the future).
 
@@ -143,7 +143,7 @@
        └── main.py
 
    3 directories, 5 files
-   $ python -m workflow.main --batchSystem=mesos …
+   $ python3 -m workflow.main --batchSystem=mesos …
 
 .. _package: https://docs.python.org/2/tutorial/modules.html#packages
 
@@ -151,7 +151,7 @@
 is part of a package called ``util``, in a subdirectory of the current
 directory. Additional functionality is in a separate module called
 ``util.sort.quick`` which corresponds to ``util/sort/quick.py``. Because we
-invoke the user module via ``python -m workflow.main``, Toil can determine the
+invoke the user module via ``python3 -m workflow.main``, Toil can determine the
 root directory of the hierarchy–``my_project`` in this case–and copy all Python
 modules underneath it to each worker. The ``-m`` option is documented `here`_
 
@@ -160,7 +160,7 @@
 When ``-m`` is passed, Python adds the current working directory to
 ``sys.path``, the list of root directories to be considered when resolving a
 module name like ``workflow.main``. Without that added convenience we'd have to
-run the workflow as ``PYTHONPATH="$PWD" python -m workflow.main``. This also
+run the workflow as ``PYTHONPATH="$PWD" python3 -m workflow.main``. This also
 means that Toil can detect the root directory of the user module's package
 hierarchy even if it isn't the current working directory. In other words we
 could do this::
@@ -168,7 +168,7 @@
    $ cd my_project
    $ export PYTHONPATH="$PWD"
    $ cd /some/other/dir
-   $ python -m workflow.main --batchSystem=mesos …
+   $ python3 -m workflow.main --batchSystem=mesos …
 
 Also note that the root directory itself must not be package, i.e. must not
 contain an ``__init__.py``.
--- toil.orig/docs/running/cliOptions.rst
+++ toil/docs/running/cliOptions.rst
@@ -17,8 +17,8 @@
 Running toil scripts requires a filepath or url to a centralizing location for all of the files of the workflow.
 This is Toil's one required positional argument: the job store.  To use the :ref:`quickstart <quickstart>` example,
 if you're on a node that has a large **/scratch** volume, you can specify that the jobstore be created there by
-executing: ``python HelloWorld.py /scratch/my-job-store``, or more explicitly,
-``python HelloWorld.py file:/scratch/my-job-store``.
+executing: ``python3 HelloWorld.py /scratch/my-job-store``, or more explicitly,
+``python3 HelloWorld.py file:/scratch/my-job-store``.
 
 Syntax for specifying different job stores:
 
--- toil.orig/docs/running/cloud/clusterUtils.rst
+++ toil/docs/running/cloud/clusterUtils.rst
@@ -58,7 +58,7 @@
 
 An example of this would be running the following::
 
-    python discoverfiles.py file:my-jobstore --stats
+    python3 discoverfiles.py file:my-jobstore --stats
 
 Where ``discoverfiles.py`` is the following:
 
@@ -112,7 +112,7 @@
 
 Continuing the example from the stats section above, if we ran our workflow with the command ::
 
-    python discoverfiles.py file:my-jobstore --stats
+    python3 discoverfiles.py file:my-jobstore --stats
 
 We could interrogate our jobstore with the status command, for example::
 
--- toil.orig/docs/contributing/contributing.rst
+++ toil/docs/contributing/contributing.rst
@@ -66,7 +66,7 @@
 This usually works as expected, but some tests need some manual preparation. To run a specific test with pytest,
 use the following::
 
-    python -m pytest src/toil/test/sort/sortTest.py::SortTest::testSort
+    python3 -m pytest src/toil/test/sort/sortTest.py::SortTest::testSort
 
 For more information, see the `pytest documentation`_.
 
