# pyslow5 python library

The slow5 python library (pyslow5) allows a user to read and write slow5/blow5 files.

## Installation

Initial setup and example info for the environment.

###### slow5lib needs python3.4.2 or higher.

If you only want to use the python library, you can simply install it using pip.

Use a virtual environment (see the Dev install section below if you need to install python); the install commands follow the optional zstd note below.

#### Optional zstd compression

You can optionally enable [*zstd* compression](https://facebook.github.io/zstd) support when building *slow5lib/pyslow5*. This requires __zstd 1.3 or higher development libraries__ installed on your system:

```sh
On Debian/Ubuntu : sudo apt-get install libzstd1-dev
On Fedora/CentOS : sudo yum install libzstd-devel
On OS X : brew install zstd
```

BLOW5 files compressed with *zstd* offer smaller file size and better performance compared to the default *zlib*. However, the *zlib* runtime library is available by default on almost all distributions, unlike *zstd*, so files compressed with *zlib* will be more 'portable'.

```bash
python3 -m venv path/to/slow5libvenv
source path/to/slow5libvenv/bin/activate
python3 -m pip install --upgrade pip
python3 -m pip install setuptools cython numpy wheel
# do this separately, after the libs above
# zlib only build
python3 -m pip install pyslow5

# for zstd build, run the following
export PYSLOW5_ZSTD=1
python3 -m pip install pyslow5
```

### Dev install

```bash
# If your native python3 meets this requirement, you can use that, or use a
# specific version installed with deadsnakes below. If you install with deadsnakes,
# you will need to call that specific python, such as python3.8 or python3.9,
# in all the following commands until you create a virtual environment with venv.
# Then once activated, you can just use python3.

# To install a specific version of python, the deadsnakes ppa is a good place to start
# This is an example for installing python3.7
# you can then call that specific python version
# > python3.7 -m pip --version
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt-get update
sudo apt install python3.7 python3.7-dev python3.7-venv


# get zlib1g-dev
sudo apt-get update && sudo apt-get install -y zlib1g-dev

# Check with
python3 --version

# You will also need the python headers if you don't already have them installed.

sudo apt-get install python3-dev
```

Building and installing the python library.

```bash
python3 -m venv /path/to/slow5libvenv
source /path/to/slow5libvenv/bin/activate
python3 -m pip install --upgrade pip
python3 -m pip install setuptools cython numpy wheel

git clone git@github.com:hasindu2008/slow5lib.git
cd slow5lib
make

# CHOOSE A OR B:  
# (B is the cleanest method)
# |=======================================================================|
# |A. Install with pip if wheel is present, otherwise it uses setuptools  |
    python3 -m pip install . --use-feature=in-tree-build
# |=======================================================================|
# |B. Or build and install manually with setup.py                         |
# |build the package                                                      |
    python3 setup.py build
# |If all went well, install the package                                  |
    python3 setup.py install
# |=======================================================================|

# This should not require sudo if using a python virtual environment/venv
# confirm installation, and find pyslow5==<version>
python3 -m pip freeze

# Ensure slow5 library is working by running the basic tests
python3 ./python/example.py


# To Remove the library
python3 -m pip uninstall pyslow5
```

## Usage

### Reading/writing a file

#### `Open(FILE, mode, rec_press="zlib", sig_press="svb_zd", DEBUG=0)`:

The pyslow5 library has one main class, `pyslow5.Open`, which opens a slow5/blow5 (slow5 for easy reference) file for reading/writing.

`FILE`: the file or filepath of the slow5 file to open
`mode`: mode in which to open the file.
+ `r`= read only
+ `w`= write/overwrite
+ `a`= append

This is designed to mimic Python's native `open()` to help users remember the syntax.

To set the record and signal compression methods, use the optional `rec_press` and `sig_press` arguments; these are only used with `mode='w'`. Appending uses whatever compression is already set in the file.

Compression Options:

`rec_press`:
- "none"
- "zlib" [default]
- "zstd" [requires `export PYSLOW5_ZSTD=1` when building]

`sig_press`:
- "none"
- "svb_zd" [default]

Example:

```python
import pyslow5

# open file
s5 = pyslow5.Open('examples/example.slow5','r')
```
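
When opening for writing, the compression options described above are passed to `Open()` in the same call. A minimal sketch (the output filename is illustrative, and `rec_press="zstd"` only works if pyslow5 was built with `PYSLOW5_ZSTD=1`):

```python
import pyslow5

# open a new BLOW5 file for writing with explicit compression settings
# (illustrative output path; "zstd" needs a zstd-enabled build)
s5_out = pyslow5.Open('output.blow5', 'w', rec_press="zstd", sig_press="svb_zd")
```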

When opening a slow5 file for the first time, an index will be created and saved in the same directory as the file being read, and then loaded. For files that already have an index, that index will be loaded.

#### `get_read_ids()`:

Returns a list of read IDs and the total number of reads from the index.
If there is no index, it creates one first.

Example:

```python
read_ids, num_reads = s5.get_read_ids()

print(read_ids)
print("number of reads: {}".format(num_reads))
```

#### `seq_reads(pA=False, aux=None)`:

Access all reads sequentially in an opened slow5.
+ If readID is not found, `None` is returned.
+ pA = Bool for converting signal to picoamps.
+ aux = `str` `'<attr_name>'`/`'all'`, or a list of auxiliary field names to add to the returned dictionary; the value is `None` if `<attr_name>` is not found
+ yields `dict` = dictionary of main fields for each read, with any aux fields added

Example:

```python
# create generator
reads = s5.seq_reads()

# print all readIDs
for read in reads:
    print(read['read_id'])

# or use directly in a for loop
for read in s5.seq_reads(pA=True, aux='all'):
    print("read_id:", read['read_id'])
    print("read_group:", read['read_group'])
    print("digitisation:", read['digitisation'])
    print("offset:", read['offset'])
    print("range:", read['range'])
    print("sampling_rate:", read['sampling_rate'])
    print("len_raw_signal:", read['len_raw_signal'])
    print("signal:", read['signal'][:10])
    print("================================")
```


#### `seq_reads_multi(threads=4, batchsize=4096, pA=False, aux=None)`:

Access all reads sequentially in an opened slow5, using multiple threads.
+ If readID is not found, `None` is returned.
+ threads = number of threads to use in the C backend.
+ batchsize = number of reads to fetch at a time. Higher numbers use more RAM but are more efficient with more threads.
+ pA = Bool for converting signal to picoamps.
+ aux = `str` `'<attr_name>'`/`'all'`, or a list of auxiliary field names to add to the returned dictionary; the value is `None` if `<attr_name>` is not found
+ yields `dict` = dictionary of main fields for each read, with any aux fields added

Example:

```python
# create generator
reads = s5.seq_reads_multi(threads=2, batchsize=3)

# print all readIDs
for read in reads:
    print(read['read_id'])

# or use directly in a for loop
for read in s5.seq_reads_multi(threads=2, batchsize=3, pA=True, aux='all'):
    print("read_id:", read['read_id'])
    print("read_group:", read['read_group'])
    print("digitisation:", read['digitisation'])
    print("offset:", read['offset'])
    print("range:", read['range'])
    print("sampling_rate:", read['sampling_rate'])
    print("len_raw_signal:", read['len_raw_signal'])
    print("signal:", read['signal'][:10])
    print("================================")
```

#### `get_read(readID, pA=False, aux=None)`:

Access a specific read using a unique readID. This is a random access method, using the index.
+ If readID is not found, `None` is returned.
+ pA = Bool for converting signal to picoamps.
+ aux = `str` `'<attr_name>'`/`'all'`, or a list of auxiliary field names to add to the returned dictionary; the value is `None` if `<attr_name>` is not found
+ returns `dict` = dictionary of main fields for read_id, with any aux fields added

Example:

```python
readID = "r1"
read = s5.get_read(readID, pA=True, aux=["read_number", "start_mux"])
if read is not None:
    print("read_id:", read['read_id'])
    print("len_raw_signal:", read['len_raw_signal'])
```


#### `get_read_list(read_list, pA=False, aux=None)`:

Access a list of specific reads using a list `read_list` of unique readIDs. This is a random access method using the index. If an index does not exist, it will create one first.
+ If readID is not found, `None` is returned.
+ pA = Bool for converting signal to picoamps.
+ aux = `str` `'<attr_name>'`/`'all'`, or a list of auxiliary field names to add to the returned dictionary; the value is `None` if `<attr_name>` is not found
+ returns a `dict` of main fields (with any aux fields added) for each read in `read_list`; `None` takes the place of any read that is not found

Example:

```python
read_list = ["r1", "r3", "null_read", "r5", "r2", "r1"]
selected_reads = s5.get_read_list(read_list)
for r, read in zip(read_list,selected_reads):
    if read is not None:
        print(r, read['read_id'])
    else:
        print(r, "read not found")
```


#### `get_read_list_multi(read_list, threads=4, batchsize=100, pA=False, aux=None):`:

Access a list of specific reads using a list `read_list` of unique readIDs using multiple threads. This is a random access method using the index. If an index does not exist, it will create one first.
+ If readID is not found, `None` is returned.
+ threads = number of threads to use in the C backend
+ batchsize = number of reads to fetch at a time. Higher numbers use more RAM but are more efficient with more threads.
+ pA = Bool for converting signal to picoamps.
+ aux = `str` `'<attr_name>'`/`'all'`, or a list of auxiliary field names to add to the returned dictionary; the value is `None` if `<attr_name>` is not found
+ returns a `dict` of main fields (with any aux fields added) for each read in `read_list`; `None` takes the place of any read that is not found

Example:

```python
read_list = ["r1", "r3", "null_read", "r5", "r2", "r1"]
selected_reads = s5.get_read_list_multi(read_list, threads=2, batchsize=3)
for r, read in zip(read_list, selected_reads):
    if read is not None:
        print(r, read['read_id'])
    else:
        print(r, "read not found")
```


#### `get_header_names()`:

Returns a list containing the union of header names from all read_groups.

#### `get_header_value(attr, read_group=0)`:

Returns a `str` of the value of a header attribute (`attr`) for a particular read_group.
Returns `None` if the value can't be found.

#### `get_all_headers(read_group=0)`:

Returns a dictionary with all header attributes and values for a particular read_group
If a value is present for one read_group but not another, the attribute will still be returned for the read_group that lacks it, but with a value of `None`.
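
Example (a minimal sketch combining the three header calls above; the `"run_id"` attribute name is just illustrative and may not be present in every file):

```python
# union of header attribute names across all read_groups
names = s5.get_header_names()
print(names)

# value of a single attribute for read_group 0 (None if not found)
print("run_id:", s5.get_header_value("run_id", read_group=0))

# all attribute/value pairs for read_group 0
headers = s5.get_all_headers(read_group=0)
for attr, value in headers.items():
    print(attr, value)
```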

#### `get_aux_names()`:

Returns an ordered list of auxiliary attribute names. (same order as get_aux_types())

This is used for understanding which auxiliary attributes are available within the slow5 file, and for providing selections to the `aux` keyword argument in the functions above.

#### `get_aux_types()`:

Returns an ordered list of auxiliary attribute types (same order as get_aux_names())

This can mostly be ignored, but will be used for error tracing in the future. Auxiliary field requests have multiple types, each with their own calls, and not all are used. If a call for an auxiliary field fails, knowing which type the field is helps to identify which C function is being called and could be causing the error.
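
Example (a short sketch; the two lists share the same order, so they can be zipped together, and the names can be passed straight to the `aux` argument of the read functions above; `"r1"` is just an example read ID):

```python
# names and types are returned in matching order
aux_names = s5.get_aux_names()
aux_types = s5.get_aux_types()
for name, typ in zip(aux_names, aux_types):
    print(name, typ)

# request exactly the auxiliary fields present in the file
read = s5.get_read("r1", aux=aux_names)
```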

### Writing a file

To write a file, `mode` in `Open()` must be set to `'w'`; when appending, use `'a'`.

#### `get_empty_header()`:

Returns a dictionary containing all known header attributes with their values set to `None`.

The user can modify each value, and add or remove attributes to be used as header items.
All values end up stored as strings, and anything left as `None` will be skipped.
To write header, see `write_header()`

Example:

```python
s5 = pyslow5.Open(file,'w')
header = s5.get_empty_header()
```

#### `write_header(header, read_group=0)`:

Write header to file

+ `header` = populated dictionary from `get_empty_header()`
+ read_group = read group integer for when multiple runs are written to the same slow5 file
+ returns 0 on success, <0 on error with error code

You must write `read_group=0` (default) first before writing any other read_groups, and it is advised to write read_groups in sequential order.

Example:

```python
# Get some empty headers
header = s5.get_empty_header()
header2 = s5.get_empty_header()

# Populate headers with some test data
counter = 0
for i in header:
    header[i] = "test_{}".format(counter)
    counter += 1

for i in header2:
    header2[i] = "test_{}".format(counter)
    counter += 1

# Write first read group
ret = s5.write_header(header)
print("ret: write_header(): {}".format(ret))
# Write second read group, etc
ret = s5.write_header(header2, read_group=1)
print("ret: write_header(): {}".format(ret))
```

#### `get_empty_record(aux=False)`:

Get empty read record for populating with data. Use with `write_record()`

+ aux = Bool for returning empty aux dictionary as well as read dictionary
+ returns a single read dictionary or a read and aux dictionary depending on aux flag

Example:
```python
# open some file to read. We will copy the data then write it
# including aux fields
s5_read = pyslow5.Open(read_file,'r')
reads = s5_read.seq_reads(aux='all')

# For each read in s5_read...
for read in reads:
    # get an empty record and aux dictionary
    record, aux = s5.get_empty_record(aux=True)
    # for each field in read...
    for i in read:
        # if the field is in the record dictionary...
        if i in record:
            # copy the value over...
            record[i] = read[i]
        # do the same for the aux dictionary
        if i in aux:
            aux[i] = read[i]
    # write the record
    ret = s5.write_record(record, aux)
    print("ret: write_record(): {}".format(ret))
```

#### `write_record(record, aux=None)`:

Write a record and optional aux fields.

+ record = a populated dictionary from `get_empty_record()`
+ aux = a populated aux dictionary from `get_empty_record(aux=True)` (optional)
+ returns 0 on success and -1 on error/failure

Example:

```python

record, aux = s5.get_empty_record(aux=True)
# populate record, aux dictionaries
#....
# Write record
ret = s5.write_record(record, aux)
print("ret: write_record(): {}".format(ret))
```


#### `write_record_batch(records, threads=4, batchsize=4096, aux=None)`:

Write multiple records and optional aux fields, using multiple threads.

+ records = a dictionary of dictionaries, where each entry is a populated record from `get_empty_record()` keyed by its read['read_id'].
+ threads = number of threads to use in the C backend.
+ batchsize = number of reads to write at a time. If writing 1000 records with batchsize=250 and threads=4, then 4 threads will be spawned 4 times, writing 250 records to the file each time before returning.
+ aux = a dictionary of populated aux records (also keyed by read_id) matching `records`
+ returns 0 on success and -1 on error/failure

Example:

```python

records = {}
auxs = {}

record, aux = s5.get_empty_record(aux=True)
# populate record, aux dictionaries
#....
records[record['read_id']] = record
auxs[record['read_id']] = aux
# Write records
ret = s5.write_record_batch(records, threads=2, batchsize=3, aux=auxs)
print("ret: write_record_batch(): {}".format(ret))
```

#### `close()`:

Closes a file that is open for writing or appending, and writes an End Of File (EOF) flag.

If not explicitly closed, the `s5` object will also trigger a close when it goes out of scope in Python, to try to avoid a missing EOF.

Please call this when you are finished writing a file.

Example:

```python
s5 = pyslow5.Open(file,'w')

# do some writing....

# Writes EOF and closes the file
s5.close()
```