File: magic.py

package info (click to toggle)
gammapy 2.0.1-1
  • links: PTS, VCS
  • area: main
  • in suites: forky, sid
  • size: 10,800 kB
  • sloc: python: 81,999; makefile: 211; sh: 11; javascript: 10
file content (383 lines) | stat: -rw-r--r-- 13,795 bytes parent folder | download
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
284
285
286
287
288
289
290
291
292
293
294
295
296
297
298
299
300
301
302
303
304
305
306
307
308
309
310
311
312
313
314
315
316
317
318
319
320
321
322
323
324
325
326
327
328
329
330
331
332
333
334
335
336
337
338
339
340
341
342
343
344
345
346
347
348
349
350
351
352
353
354
355
356
357
358
359
360
361
362
363
364
365
366
367
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
383
"""
MAGIC with Gammapy
==================

Explore the MAGIC IRFs and perform a point like spectral analysis with energy dependent offset cut.


Introduction
------------

`MAGIC <https://magic.mpp.mpg.de/>`__ (Major Atmospheric Gamma Imaging
Cherenkov Telescopes) consists of two Imaging Atmospheric Cherenkov telescopes in La Palma, Spain.
These 17m diameter telescopes detect gamma-rays from ~ 30 GeV to 100 TeV.
The MAGIC public data release contains around 100 hours of data and can be found `here <https://opendata.magic.pic.es/>`__.
This notebook presents an analysis based on just two runs of the Crab Nebula.
It provides an introduction to using the MAGIC DL3 data products, to produce a
`~gammapy.datasets.SpectrumDatasetOnOff`. Importantly it shows how to perform a data reduction
with energy-dependent directional cuts, as described further below.
For further information, see `this paper <https://ui.adsabs.harvard.edu/abs/2024JHEAp..44..266A/abstract>`__.

Prerequisites
-------------

-  Understanding the basic data reduction performed in the
   :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial.
-  understanding the difference between a
   `point-like <https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/point_like/index.html>`__
   and a
   `full-enclosure <https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/full_enclosure/index.html>`__
   IRF.


Context
-------

As described in the :doc:`/tutorials/analysis-1d/spectral_analysis`
tutorial, the background is estimated from the field of view (FoV) of the observation.
Since the MAGIC data release does not include a dedicated background IRF, this estimation
must be performed directly from the FoV.
In particular, the source and background events are counted within a circular
ON region enclosing the source. The background to be subtracted is then estimated
from one or more OFF regions with an expected background rate similar to the one
in the ON region (i.e. from regions with similar acceptance).

*Full-containment* IRFs have no directional cut applied, when employed
for a 1D analysis, it is required to apply a correction to the IRF
accounting for flux leaking out of the ON region. This correction is
typically obtained by integrating the PSF within the ON region.

When computing a *point-like* IRFs, a directional cut around the assumed
source position is applied to the simulated events. For this IRF type,
no PSF component is provided. The size of the ON and OFF regions used
for the spectrum extraction should then reflect this cut, since a
response computed within a specific region around the source is being
provided.

The directional cut is typically an angular distance from the assumed
source position, :math:`\\theta`. The
`gamma-astro-data-format <https://gamma-astro-data-formats.readthedocs.io/en/latest/>`__
specifications offer two different ways to store this information:

* if the same :math:`\\theta` cut is applied at all energies and offsets, a
  `RAD_MAX <https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/point_like/#rad-max>`__
  keyword is added to the header of the data units containing IRF components. This
  should be used to define the size of the ON and OFF regions;
* in case an energy-dependent (and offset-dependent) :math:`\\theta` cut is applied, its
  values are stored in additional ``FITS`` data unit, named
  `RAD_MAX_2D <https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/point_like/#rad-max-2d>`__.

Gammapy provides a class to automatically read these values,
`~gammapy.irf.RadMax2D`, for both cases (fixed or energy-dependent
:math:`\\theta` cut). In this notebook we will focus on how to perform a
spectral extraction with a point-like IRF with an energy-dependent
:math:`\\theta` cut. We remark that in this case a
`~regions.PointSkyRegion` (and not a `~regions.CircleSkyRegion`)
should be used to define the ON region. If a geometry based on a
`~regions.PointSkyRegion` is fed to the spectra and the background
`~gammapy.makers.Maker`, the latter will automatically use the values stored in the
``RAD_MAX`` keyword / table for defining the size of the ON and OFF
regions.

Beside the definition of the ON region during the data reduction, the
remaining steps are identical to the other :doc:`/tutorials/analysis-1d/spectral_analysis`
tutorial, so we will not detail the data reduction steps already
presented in the other tutorial.

**Objective: perform the data reduction and analysis of 2 Crab Nebula
observations of MAGIC and fit the resulting datasets.**

"""

######################################################################
# Setup
# -----
#
# As usual, we’ll start with some setup …
#
from IPython.display import display
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import PointSkyRegion

# %matplotlib inline
import matplotlib.pyplot as plt

from gammapy.data import DataStore
from gammapy.datasets import Datasets, SpectrumDataset
from gammapy.makers import (
    ReflectedRegionsBackgroundMaker,
    SafeMaskMaker,
    SpectrumDatasetMaker,
    WobbleRegionsFinder,
)
from gammapy.maps import Map, MapAxis, RegionGeom
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    LogParabolaSpectralModel,
    SkyModel,
    create_crab_spectral_model,
)
from gammapy.visualization import plot_spectrum_datasets_off_regions


######################################################################
# Load data
# ---------
#
# We load the two MAGIC observations of the Crab Nebula which contain a
# ``RAD_MAX_2D`` table.
#

data_store = DataStore.from_dir("$GAMMAPY_DATA/magic/rad_max/data")
observations = data_store.get_observations(required_irf="point-like")

######################################################################
# We can take a look at the MAGIC IRFs:
#

observations[0].peek()
plt.show()
# sphinx_gallery_thumbnail_number = 1

######################################################################
# The ``rad_max`` attribute, containing the ``RAD_MAX_2D`` table, is
# automatically loaded in the observation. As we can see from the IRF
# component axes, the table has a single offset value and 28 estimated
# energy values.
#

rad_max = observations["5029747"].rad_max
print(rad_max)


######################################################################
# Plotting the rad max value against the energy:
#

fig, ax = plt.subplots()
rad_max.plot_rad_max_vs_energy(ax=ax)
plt.show()


######################################################################
# Define the ON region
# --------------------
#
# To use the ``RAD_MAX_2D`` values to define the sizes of the ON and OFF
# regions it is necessary to specify the ON region as
# a `~regions.PointSkyRegion`  i.e. we specify only the center of our ON region.
#

target_position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs")
on_region = PointSkyRegion(target_position)


######################################################################
# Run data reduction chain
# ------------------------
#
# We begin by configuring the dataset maker classes.
# First, we define the reconstructed and true energy axes:

energy_axis = MapAxis.from_energy_bounds(
    50, 1e5, nbin=5, per_decade=True, unit="GeV", name="energy"
)
energy_axis_true = MapAxis.from_energy_bounds(
    10, 1e5, nbin=10, per_decade=True, unit="GeV", name="energy_true"
)

######################################################################
# We create a `~gammapy.maps.RegionGeom` by combining the ON region with the
# estimated energy axis of the `~gammapy.datasets.SpectrumDataset` we want to produce.
# This geometry in used to create the `~gammapy.datasets.SpectrumDataset`.

geom = RegionGeom.create(region=on_region, axes=[energy_axis])

dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true)


######################################################################
# The `~gammapy.makers.SpectrumDatasetMaker` and `~gammapy.makers.ReflectedRegionsBackgroundMaker`
# will utilise the :math:`\theta` values in `~gammapy.data.Observation.rad_max` to define
# the sizes of the OFF regions.
#

dataset_maker = SpectrumDatasetMaker(
    containment_correction=False, selection=["counts", "exposure", "edisp"]
)

######################################################################
# In order to define the OFF regions it is recommended to use a
# `~gammapy.makers.WobbleRegionsFinder`, that uses fixed positions for
# the OFF regions. In the different estimated energy bins we will have OFF
# regions centered at the same positions, but with changing size.
#
# The parameter ``n_off_regions`` specifies the number of OFF regions to be considered.
# In this case we use 3.
#

region_finder = WobbleRegionsFinder(n_off_regions=3)
bkg_maker = ReflectedRegionsBackgroundMaker(region_finder=region_finder)

######################################################################
# Use the energy threshold specified in the DL3 files for the safe mask:

safe_mask_masker = SafeMaskMaker(methods=["aeff-default"])

datasets = Datasets()

######################################################################
# Create a counts map for visualisation later:

counts = Map.create(skydir=target_position, width=3)


######################################################################
# Perform the data reduction loop:

for observation in observations:
    dataset = dataset_maker.run(
        dataset_empty.copy(name=str(observation.obs_id)), observation
    )
    counts.fill_events(observation.events)
    dataset_on_off = bkg_maker.run(dataset, observation)
    dataset_on_off = safe_mask_masker.run(dataset_on_off, observation)
    datasets.append(dataset_on_off)


######################################################################
# Now we can plot the off regions and target positions on top of the counts
# map:
#

ax = counts.plot(cmap="viridis")
geom.plot_region(ax=ax, kwargs_point={"color": "k", "marker": "*"})
plot_spectrum_datasets_off_regions(ax=ax, datasets=datasets)
plt.show()


######################################################################
# Fit spectrum
# ------------
#
# We perform a joint likelihood fit of the two datasets. For this particular datasets
# we select a fit range between :math:`80\,{\rm GeV}` and :math:`20\,{\rm TeV}`.
#

e_min = 80 * u.GeV
e_max = 20 * u.TeV

for dataset in datasets:
    dataset.mask_fit = dataset.counts.geom.energy_mask(e_min, e_max)

spectral_model = LogParabolaSpectralModel(
    amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"),
    alpha=2,
    beta=0.1,
    reference=1 * u.TeV,
)
model = SkyModel(spectral_model=spectral_model, name="crab")

datasets.models = [model]

fit = Fit()
result = fit.run(datasets=datasets)

# we make a copy here to compare it later
best_fit_model = model.copy()


######################################################################
# Fit quality and model residuals
# -------------------------------
#


######################################################################
# We can access the results dictionary to see if the fit converged:
#

print(result)


######################################################################
# and check the best-fit parameters
#

display(datasets.models.to_parameters_table())


######################################################################
# A simple way to inspect the model residuals is using the function
# `~SpectrumDatasetOnOff.plot_fit()`
#
ax_spectrum, ax_residuals = datasets[0].plot_fit()
ax_spectrum.set_ylim(0.1, 120)
plt.show()


######################################################################
# For more ways of assessing fit quality, please refer to the dedicated
# :doc:`/tutorials/details/fitting` tutorial.
#


######################################################################
# Compare against the literature
# ------------------------------
#
# Let us compare the spectrum we obtained against a `previous measurement
# by
# MAGIC <https://ui.adsabs.harvard.edu/abs/2015JHEAp...5...30A/abstract>`__.
#
fig, ax = plt.subplots()
plot_kwargs = {
    "energy_bounds": [0.08, 20] * u.TeV,
    "sed_type": "e2dnde",
    "ax": ax,
}
ax.yaxis.set_units(u.Unit("TeV cm-2 s-1"))
ax.xaxis.set_units(u.Unit("GeV"))
crab_magic_lp = create_crab_spectral_model("magic_lp")

best_fit_model.spectral_model.plot(
    ls="-", lw=1.5, color="crimson", label="best fit", **plot_kwargs
)
best_fit_model.spectral_model.plot_error(facecolor="crimson", alpha=0.4, **plot_kwargs)
crab_magic_lp.plot(ls="--", lw=1.5, color="k", label="MAGIC reference", **plot_kwargs)

ax.legend()
ax.set_ylim([1e-13, 1e-10])
plt.show()


######################################################################
# .. _magic-dataset_sims:
#
# Dataset simulations
# -------------------
#
# A common way to check if a fit is biased is to simulate multiple datasets with
# the obtained best fit model, and check the distribution of the fitted parameters.
# Here, we show how to perform one such simulation assuming the measured off counts
# provide a good distribution of the background.
#

dataset_simulated = datasets.stack_reduce().copy(name="simulated_ds")
simulated_model = best_fit_model.copy(name="simulated")
dataset_simulated.models = simulated_model
dataset_simulated.fake(
    npred_background=dataset_simulated.counts_off * dataset_simulated.alpha
)
dataset_simulated.peek()
plt.show()


######################################################################
# The important thing to note here is that while this samples the on-counts, the off counts are
# not sampled. If you have multiple measurements of the off counts, they should be used.
# Alternatively, you can try to create a parametric model of the background.

result = fit.run(datasets=[dataset_simulated])
print(result.models.to_parameters_table())