File: write.rst

fdb 5.7.0-5
fdb-write
=========

| Inserts data into the FDB, creating new databases if needed.
| The data is copied into the FDB, and the tool reports the location where it was inserted.
| This process is atomic and can be run concurrently with other processes reading from or writing to the same FDB databases.

Usage
-----

``fdb-write [options] <gribfile1> [gribfile2] ...``

Options
-------

+----------------------------------------+-----------------------------------------------------------------------------------------+
| ``--verbose``                          | Prints more information, namely the key of each datum and                               |
|                                        | information about which data was filtered out                                           |
+----------------------------------------+-----------------------------------------------------------------------------------------+
| ``--statistics``                       | Report timing statistics                                                                |
+----------------------------------------+-----------------------------------------------------------------------------------------+
| ``--include-filter=string=string,...`` | | Filter out any data that **does not match** these key-value pairs.                    |
|                                        | | Key-value pairs can be in the form of a MARS request, e.g.: ``step=141/to/240/by/3``  |
+----------------------------------------+-----------------------------------------------------------------------------------------+
| ``--exclude-filter=string=string,...`` | | Filter out any data that **does match** these key-value pairs.                        |
|                                        | | Key-value pairs can be in the form of a MARS request, e.g.: ``levelist=850/1000``     |
+----------------------------------------+-----------------------------------------------------------------------------------------+
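Both filter options accept MARS-style values, including ranges, and can be combined in a single invocation. The sketch below illustrates this; the file name ``forecast.grib`` and the key values are hypothetical, and no output is shown since it depends on the contents of the input file::

  % fdb-write --include-filter=step=141/to/240/by/3 --exclude-filter=levelist=1000 forecast.grib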

Examples
--------

You may pass multiple GRIB files. The tool inserts them sequentially.
::

  % fdb-write data.grib
  
  Processing data.grib
  FDB archive 12 fields, size 37.5412 Mbytes, in 0.088939 second (422.091 Mbytes per second)
  fdb::service::archive: 0.089006 second elapsed, 0.089005 second cpu



Use ``--include-filter`` when you have a large data set of which you want to write only a small subset, easily identified by a few keys:
::

  % fdb-write --verbose --include-filter=time=0000 data.grib
 
  Processing data.grib
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=155,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=138,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=155,step=0,stream=oper,time=0000,type=an}
  Include key {time=0000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=1200,type=an}
  Include key {time=0000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=155,step=0,stream=oper,time=1200,type=an}
  Include key {time=0000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=138,step=0,stream=oper,time=1200,type=an}
  Include key {time=0000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=155,step=0,stream=oper,time=1200,type=an}
  Archiving {class=rd,date=20160401,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=0000,type=an}
  ...
  FDB archive 8 fields, size 25.0275 Mbytes, in 0.129475 second (193.301 Mbytes per seconds)
  fdb::service::archive: 0.129522 second elapsed, 0.129514 second cpu



Use ``--exclude-filter`` when you have a large data set from which you want to filter out a small subset, easily identified by a few keys:
::

  % fdb-write --verbose --exclude-filter=time=1200,levelist=1000 data.grib
  
  Processing data.grib
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=155,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=138,step=0,stream=oper,time=0000,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=155,step=0,stream=oper,time=0000,type=an}
  Exclude key {time=1200,levelist=1000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=1200,type=an}
  Exclude key {time=1200,levelist=1000} filtered out datum {class=rd,date=20160402,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=155,step=0,stream=oper,time=1200,type=an}
  Archiving {class=rd,date=20160402,domain=g,expver=xxxx,levelist=850,levtype=pl,param=138,step=0,stream=oper,time=1200,type=an}
  Archiving {class=rd,date=20160401,domain=g,expver=xxxx,levelist=850,levtype=pl,param=155,step=0,stream=oper,time=0000,type=an}
  Exclude key {time=1200,levelist=1000} filtered out datum {class=rd,date=20160401,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=138,step=0,stream=oper,time=1200,type=an}
  Exclude key {time=1200,levelist=1000} filtered out datum {class=rd,date=20160401,domain=g,expver=xxxx,levelist=1000,levtype=pl,param=155,step=0,stream=oper,time=1200,type=an}
  ...
  FDB archive 12 fields, size 37.5412 Mbytes, in 0.160719 second (233.584 Mbytes per seconds)
  fdb::service::archive: 0.160764 second elapsed, 0.160724 second cpu