File: tutorials.rst

.. _tutorials:

Tutorials
=========

Once you have
:ref:`Python and MNE-Python up and running <install_python_and_mne_python>`,
you can use these tutorials to get started processing MEG/EEG data.
Each tutorial covers one step of the processing pipeline, and you can re-run
the Python code yourself by copying and pasting it.

These tutorials aim to capture only the most important information.
For further reading:

- For a high-level overview of what you can do with MNE-Python:
  :ref:`what_can_you_do`
- For more examples of analyzing M/EEG data, including more sophisticated
  analysis: :ref:`general_examples`
- For details about analysis steps: :ref:`manual`
- For details about specific functions and classes: :ref:`api_reference`

.. note:: The default location for the MNE-sample data is
          my-path-to/mne-python/examples. If you have already downloaded the
          data but an example asks whether to download it again, make sure
          the data reside in the examples directory and that you run the
          script from within that directory:

          .. code-block:: bash

              $ cd examples/preprocessing

          Then in Python you can do::

              In [1]: %run plot_find_ecg_artifacts.py


          See :ref:`datasets` for a list of all available datasets and some
          advanced configuration options, e.g. to specify a custom
          location for storing the datasets.
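
          For instance, a minimal sketch of pointing the sample dataset at a
          custom directory (the path below is only a placeholder)::

              In [2]: import mne

              In [3]: mne.set_config('MNE_DATASETS_SAMPLE_PATH', '/my/data/path')

              In [4]: mne.datasets.sample.data_path()  # looks for (or downloads) the data under the path above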

.. container:: span box

  .. raw:: html

    <h2>Introduction to MNE and Python</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_python_intro.rst
    tutorials/seven_stories_about_mne.rst
    auto_tutorials/plot_introduction.rst

.. container:: span box

  .. raw:: html

    <h2>Background information</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_background_filtering.rst

.. container:: span box

  .. raw:: html

    <h2>Preprocessing</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_artifacts_detection.rst
    auto_tutorials/plot_artifacts_correction_filtering.rst
    auto_tutorials/plot_artifacts_correction_rejection.rst
    auto_tutorials/plot_artifacts_correction_ssp.rst
    auto_tutorials/plot_artifacts_correction_ica.rst
    auto_tutorials/plot_artifacts_correction_maxwell_filtering.rst

.. container:: span box

  .. raw:: html

    <h2>Sensor-level analysis</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_epoching_and_averaging.rst
    auto_tutorials/plot_eeg_erp.rst
    auto_tutorials/plot_sensors_time_frequency.rst
    auto_tutorials/plot_sensors_decoding.rst

.. container:: span box

  .. raw:: html

    <h2>Visualization and Reporting</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_visualize_raw.rst
    auto_tutorials/plot_visualize_epochs.rst
    auto_tutorials/plot_visualize_evoked.rst
    tutorials/report.rst

.. container:: span box

  .. raw:: html

    <h2>Manipulating Data Structures and Containers</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_object_raw.rst
    auto_tutorials/plot_modifying_data_inplace.rst
    auto_tutorials/plot_object_epochs.rst
    auto_tutorials/plot_object_evoked.rst
    auto_tutorials/plot_creating_data_structures.rst
    auto_tutorials/plot_info.rst

.. container:: span box

  .. raw:: html

    <h2>Source-level analysis</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_forward.rst
    auto_tutorials/plot_compute_covariance.rst
    auto_tutorials/plot_mne_dspm_source_localization.rst
    auto_tutorials/plot_dipole_fit.rst
    auto_tutorials/plot_brainstorm_auditory.rst
    auto_tutorials/plot_brainstorm_phantom_ctf.rst
    auto_tutorials/plot_brainstorm_phantom_elekta.rst
    auto_tutorials/plot_point_spread.rst

.. container:: span box

  .. raw:: html

    <h2>Sensor-space Univariate Statistics</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_stats_cluster_methods.rst
    auto_tutorials/plot_stats_spatio_temporal_cluster_sensors.rst
    auto_tutorials/plot_stats_cluster_1samp_test_time_frequency.rst
    auto_tutorials/plot_stats_cluster_time_frequency.rst

.. container:: span box

  .. raw:: html

    <h2>Source-space Univariate Statistics</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_stats_cluster_time_frequency_repeated_measures_anova.rst
    auto_tutorials/plot_stats_cluster_spatio_temporal_2samp.rst
    auto_tutorials/plot_stats_cluster_spatio_temporal_repeated_measures_anova.rst
    auto_tutorials/plot_stats_cluster_spatio_temporal.rst

.. container:: span box

  .. raw:: html

    <h2>Multivariate Statistics - Decoding</h2>

  .. toctree::
    :maxdepth: 1

    auto_tutorials/plot_sensors_decoding.rst

.. container:: span box

  .. raw:: html

    <h2>Command line tools</h2>

  .. toctree::
    :maxdepth: 1

    tutorials/command_line.rst
    generated/commands.rst