\input texinfo @c -*-texinfo-*-
@c %**start of header
@setfilename cnrun-lua-api.info
@settitle CNrun Lua API
@c %**end of header

@include version.texi

@dircategory Libraries
@direntry
* CNrun Lua API: (cnrun-lua).        CNrun API in Lua.
@end direntry

@copying

Copyright @copyright{} 2014 Andrei Zavada @email{johnhommer@@gmail.com}.

The files representing this documentation set are part of the CNrun
project, and are covered by GPL-2+.
@end copying

@titlepage
@title CNrun Lua API
@subtitle version @value{VERSION}
@subtitle @value{UPDATED}
@author Andrei Zavada

@page
@vskip 0pt plus 1filll
@insertcopying
@end titlepage

@contents

@ifnottex
@node Top
@top CNrun Lua API
@comment  node-name,  next,  previous,  up
@insertcopying

This file documents the CNrun functions exposed in Lua.
@end ifnottex

@c The master menu, created with texinfo-master-menu, goes here.

@menu
* Introduction::  CNrun is a neuronal network model simulator, with
  scripting done in Lua.
* General notes::  Loading cnrun module in Lua; how errors are reported.
* Interpreter context::  CNrun interpreter context needs to be created
  first.
* Models::  Operations on neuronal network models: create, populate,
  simulate, etc.
* Individual units::  View/modify individual units' parameters and
  state variables.
* External excitation sources::  Tape, periodic and noise stimulation
  sources.
* Sampling state variables::  Ways to assess model state and behaviour.
* Unit species::  A table of all available built-in units.
* Planned features::  There are some, although only time permitting.
* Index::  All functions listed alphabetically.
@end menu

@node Introduction
@chapter Introduction

CNrun is a slow but precise neuronal network model simulator written in
C++, whose functions are exposed to the Lua scripting language.  These
functions are described in this document.

In the present version (2.x), the CNrun core is built as a shared
library, in contrast to CNrun 1.x, which was a single executable
interpreting its own, very simple scripts.  To enable more capable
scripting, with interesting possibilities such as network plasticity
regulated by model activity (excitation levels, spike patterns, etc.),
wrappers are provided to call the core functions from Lua.

In the simplest case where you have a NeuroML-defined topology, a
simulation session could be as brief as this:

@example
local cn = require("cnrun")
local _, C = cn.get_context()
cn.new_model (C, "fafa")
cn.import_nml (C, "fafa", "model.nml")
cn.advance (C, "fafa", 1000)
@end example

@noindent
This snippet will create an interpreter context, create a model in it,
load an NML file, and advance the model one second.

To report a bug or request a wishlist item, go to @url{http://github.com/hmmr/cnrun}.

@node General notes
@chapter General notes

@section Preparations
 All functions are made available in the @code{cnrun} module namespace,
 by means of the standard @code{require}.  Thus: @code{local M =
 require("cnrun"); M.some_function(args)}.

@section Returned arguments
 On error, all functions return two arguments: first a @code{nil},
 and second, an error message describing what went wrong (a string).

 On success, the first returned argument will be 1 (an integer),
 followed by one or more values specifically described in the following
 sections.  Unless stated otherwise, functions which have nothing
 meaningful to return will, on success, return @code{1, model_name}.

@node Interpreter context
@chapter Interpreter context

In Lua, after loading the cnrun module with @code{require("cnrun")},
the first step is to get an interpreter context.  It is an opaque light
user data object, which you should pass as the first argument to all
subsequent calls to CNrun functions.

You can create and keep multiple models in a context, modify and advance
them independently.

The function to create a CNrun context is @code{get_context()}:

@defun get_context ()
  Create a CNrun interpreter context, in which all subsequent operations
  will be performed.

  On success, returns the newly created context object @var{C} as the
  second argument.
@end defun

@defun drop_context (C)
  Drop the interpreter context @var{C}, previously obtained with
  @emph{get_context()}.
@end defun

In the following sections, context is passed as the first (or only)
argument to all functions.  It is denoted as @var{C} and not described
each time.
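For illustration, a sketch of the context life cycle holding two
independent models (the model labels are arbitrary):

@example
local cn = require("cnrun")
local _, C = cn.get_context()

-- models live side by side in the same context
cn.new_model(C, "one")
cn.new_model(C, "two")

-- ... populate and advance each model independently ...

cn.drop_context(C)
@end example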

@node Models
@chapter Models

Multiple models can be created, accessed, modified, advanced within a
single interpreter context.  Models are identified by a label (a string).

@section Creating and deleting models

@defun new_model (C, M)
  Create a model named @var{M} (model label).
@end defun

@defun delete_model (C, M)
  Delete model @var{M}.
@end defun

@defun list_models (C)
  List the models existing in context @var{C}, returning their labels
  as strings.
@end defun
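A small sketch of the model-management calls above (assuming the module
loaded as @code{cn} and a context @var{C} already obtained with
@code{get_context()}; that @code{list_models()} returns one label per
model after the status code is an assumption here):

@example
cn.new_model(C, "alpha")
cn.new_model(C, "beta")

-- first returned value is the status, then the model labels
local ok, m1, m2 = cn.list_models(C)

cn.delete_model(C, "beta")
@end example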

@section Populating models

Models can be populated by constituent neurons and synapses in two ways:
@enumerate
@item importing topology from a file (@code{import_nml()});
@item adding individual units one by one (@code{new_neuron()}, @code{new_synapse()}).
@end enumerate

@defun import_nml (C, M, file_name)
  Import network topology from a file (@var{file_name}) into a model
  named @var{M}.
@end defun

@defun export_nml (C, M, file_name)
  Export network topology of model @var{M} into file @var{file_name}.
@end defun

@defun new_neuron (C, M, type, label)
  Create a neuron of type @var{type}, labelled @var{label}, in model
  @var{M}.  @var{label} must be of the form ``population.id''.
@end defun

@defun new_synapse (C, M, type, source, target)
  Create a synapse of this @var{type} connecting neurons labelled
  @var{source} and @var{target}.
@end defun
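As an illustration of the second way, a minimal two-neuron circuit
built unit by unit (assuming the module loaded as @code{cn} and an
existing model ``fafa''; species names @code{HH} and @code{AB} are
taken from the tables in @ref{Unit species}):

@example
cn.new_neuron (C, "fafa", "HH", "pop.0")
cn.new_neuron (C, "fafa", "HH", "pop.1")

-- an alpha-beta synapse from pop.0 onto pop.1
cn.new_synapse(C, "fafa", "AB", "pop.0", "pop.1")
@end example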

@section Other operations on models as a whole

@defun reset_model (C, M)
  Reset the state of all units, rewind all periodic sources and flush
  and close any logs in model @var{M}.
@end defun

@defun cull_deaf_synapses (C, M)
  Remove all synapses with a zero @var{gsyn} in model @var{M}.  This
  makes sense unless you intend to modify @var{gsyn} later.
@end defun

@defun describe_model (C, M)
  Describe model @var{M}.  The output will be printed to stdout and look
  like this:
@verbatim
Model "FAFA":
     13 units total (7 Neurons, 6 Synapses):
       11 hosted,
        2 standalone
        0 discrete dt-bound
      0 Listening units
      0 Spikelogging neurons
      0 Units being tuned continuously
      0 Units being tuned periodically
      2 Spontaneously firing neurons
      2 Multiplexing synapses
     26 vars on integration vector
@end verbatim
@end defun

@defun advance (C, M, duration)
  Run simulation in model @var{M} for @var{duration} milliseconds.
@end defun

@defun advance_until (C, M, time)
  Run simulation in model @var{M} until model time reaches @var{time}.

  Note that the eventual model time after this function returns may
  exceed @var{time} slightly, by less than the last @var{dt}.
@end defun


@section Model parameters

Each model has the following parameters that affect its behaviour:

@table @emph
@item verbosely
Level of verbosity of printed messages (integer, 0 up to 6).

@item integration_dt_min
Lower bound for @var{dt} (float).

@item integration_dt_max
Upper bound for @var{dt} (float).

@item integration_dt_cap
Maximal factor by which @var{dt} can be allowed to increase in
consecutive iterations (float).

@item listen_dt
A time increment between consecutive sampling and logging of state
variables (float).

@item listen_mode
A string of symbols defining unit `listening' mode, each symbol
@emph{x} optionally followed by @emph{-} to disable the corresponding
mode (if @emph{-} is absent, the mode is enabled).  There are three
modes: @var{1}, log the first state variable only rather than all unit
vars; @var{d}, defer writing until the end of simulation; and @var{b},
write floating-point values in native machine representation instead
of @code{"%g"}.

@item sxf_start_delay
Length of time before and after the sampling point, limiting the
extent of spike counting for sdf/sxf evaluation (float).  Leave at 0
to count all spikes from time 0 until the current model time; a couple
of seconds should give reasonable accuracy.

@item sxf_period
Sampling period for sdf and shf (spike density and spike heterogeneity)
functions.

@item sdf_sigma
Parameter @var{sigma} in sdf (float).
@end table

@defun get_model_parameter (C, M, P)
  Get a model parameter @var{P}, one of those listed above.
@end defun

@defun set_model_parameter (C, M, P, V)
  Set a model parameter @var{P} to value @var{V}.
@end defun
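For example (assuming the module loaded as @code{cn} and an existing
model ``fafa''; the parameter values here are arbitrary, chosen only
to show the calls):

@example
cn.set_model_parameter(C, "fafa", "integration_dt_max", 0.5)
cn.set_model_parameter(C, "fafa", "listen_dt", 1.0)

local ok, v = cn.get_model_parameter(C, "fafa", "integration_dt_max")
@end example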


@node Individual units
@chapter Individual unit identification, properties and parameters

Units populating a model are uniquely identified by their label, set at
creation time.  Where a single unit needs to be selected for a function,
the corresponding argument to that function is designated @var{L}.
Where an operation affects many units, selected by a regex pattern on
their labels, that argument is designated @var{R}.

Apart from the (arbitrary) label, units are classified as belonging to
either Neuron or Synapse class, further belonging to a certain family
and species.  These categories are built-in, each species defining a
set of parameters and state variables.

All available unit species are listed in @ref{Unit species}.

@defun get_unit_properties (C, M, L)
  Return the following attributes and properties of unit @var{L}, in
  order: @emph{label}, @emph{class_name}, @emph{family}, @emph{species},
  as strings, followed by flags @emph{has_sources} and
  @emph{is_not_altered}.
@end defun

@defun get_unit_parameter (C, M, L, P)
  Get the value of unit @var{L}'s parameter @var{P}.
@end defun

@defun set_unit_parameter (C, M, L, P, V)
  Set unit @var{L}'s parameter @var{P} to a value of @var{V}.
@end defun

@defun get_unit_vars (C, M, L)
  Get the values of all state variables of unit @var{L}, returned as
  floats in the order they are listed in @ref{Unit species} table.
@end defun

@defun reset_unit (C, M, L)
  Reset all state variables of unit @var{L}.
@end defun

@defun get_units_matching (C, M, R)
  Return all units with labels matching regex @var{R}.
@end defun

@defun get_units_of_type (C, M, sp)
  Return all units of a species @var{sp}.
@end defun

@defun set_matching_neuron_parameter (C, M, R, P, V)
  Set the value of parameter @var{P} to @var{V} in all neurons labelled
  matching regex @var{R}.
@end defun

@defun set_matching_synapse_parameter (C, M, Rs, Rt, P, V)
  Set the value of parameter @var{P} to @var{V} in all synapses
  connecting any neuron whose label matches regex @var{Rs} to any
  neuron whose label matches regex @var{Rt}.
@end defun

@defun revert_matching_unit_parameters (C, M, R)
  Revert to defaults all parameters of units labelled matching regex
  @var{R}.
@end defun

@defun decimate (C, M, R, frac)
  Delete a random @var{frac} of all units matching regex @var{R}.
@end defun

@defun putout (C, M, R)
  Delete all units matching regex @var{R}.
@end defun
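A sketch combining some of the calls above (assuming the module loaded
as @code{cn} and an existing model ``fafa''; the regexes follow the
``population.id'' labelling convention, and the parameter value is
arbitrary):

@example
-- set Idc in every neuron of population "pop"
cn.set_matching_neuron_parameter(C, "fafa", "pop\\..*", "Idc", 0.3)

-- inspect a single unit
local ok, label, class, family, species =
   cn.get_unit_properties(C, "fafa", "pop.0")

-- delete a random tenth of the population
cn.decimate(C, "fafa", "pop\\..*", 0.1)
@end example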


@node External excitation sources
@chapter External excitation sources

CNrun provides three types of external stimulation sources:

@itemize @bullet
@item @emph{Tape},
 with all possible values defined in sequence, with timestamps, in a
 file, optionally looping.
@item @emph{Periodic},
 with a sequence of values defined to occur at regular intervals within
 a specified period.
@item @emph{Noise},
 a continuous sampling from a uniform or gaussian distribution.
@end itemize

@defun new_tape_source (C, M, source_name, file_name, looping)
  Set up a new tape source named @var{source_name}, from data in file
  @var{file_name}, optionally @var{looping} when the tape is exhausted.
@end defun

@defun new_periodic_source (C, M, source_name, file_name, looping, period)
  Set up a new periodic source named @var{source_name}, from data in
  file @var{file_name}, optionally @var{looping} over a @var{period}
  (otherwise holding the last value).
@end defun

@defun new_noise_source (C, M, source_name, min, max, sigma, distribution)
  Set up a new noise source named @var{source_name}, of a given
  @var{distribution} (possible values are @emph{"uniform"} and
  @emph{"gaussian"}), with given @var{min}, @var{max}, and (for the
  gaussian) @var{sigma}.
@end defun

@defun get_sources (C, M)
  Get all sources created in the model, returning labels as strings.
@end defun

@defun connect_source (C, M, L, P, source_name)
  Connect source @var{source_name} to parameter @var{P} of unit @var{L}.
@end defun

@defun disconnect_source (C, M, L, P, source_name)
  Disconnect a previously connected source @var{source_name} from
  parameter @var{P} of unit @var{L}.
@end defun
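For example, feeding gaussian noise into a unit parameter (assuming
the module loaded as @code{cn} and an existing model ``fafa''; whether
a given parameter accepts a source depends on the unit species, so
@var{Idc} here is only an illustration):

@example
cn.new_noise_source(C, "fafa", "noise1", 0.0, 0.5, 0.1, "gaussian")

cn.connect_source(C, "fafa", "pop.0", "Idc", "noise1")
-- ... advance the model ...
cn.disconnect_source(C, "fafa", "pop.0", "Idc", "noise1")
@end example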


@node Sampling state variables
@chapter Sampling state variables

In addition to direct access to unit state variables in Lua (see
@code{get_unit_vars()}), there are two ways to record unit state
for offline assessment:

@itemize @bullet

@item Have units write their state variable(s) to logs, created in the
current directory and named after unit labels with a suffix ``.vars'';

@item Have neurons record the times of spikes, written to similarly
named files except with suffix ``.spikes''.

@end itemize

@defun start_listen (C, M, R)
  Enable logging of state variables (with options as defined in the
  model parameter @var{listen_mode}; @pxref{Models}) in all units
  labelled matching regex @var{R}.  Return the count of units affected.
@end defun

@defun stop_listen (C, M, R)
  Disable logging of state variables in all units labelled matching
  regex @var{R}.  Units writing logs will flush and close.  Return
  the count of units affected.
@end defun

@defun start_log_spikes (C, M, R)
  Enable logging of spike times in all neurons labelled matching regex
  @var{R}.  Return the count of units affected.
@end defun

@defun stop_log_spikes (C, M, R)
  Disable logging spikes in all units labelled matching regex @var{R}.
  Return the count of units affected.
@end defun
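Putting it together, a typical recording session might read (a sketch,
assuming the module loaded as @code{cn} and labels as in the earlier
examples):

@example
cn.start_listen    (C, "fafa", "pop\\..*")  -- will write pop.*.vars
cn.start_log_spikes(C, "fafa", "pop\\..*")  -- will write pop.*.spikes

cn.advance(C, "fafa", 5000)                 -- 5 s of model time

cn.stop_listen(C, "fafa", "pop\\..*")       -- flush and close the logs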


@node Unit species
@chapter Unit species

Given below are some unit species available for your models.  For a
complete list, with parameter standard values and descriptions, see the
output of @code{dump_available_units()}.

@section Neuron species
@multitable @columnfractions .1 .25 .15 .5
@headitem Species @tab Parameters @tab State vars @tab Description

@item HH
@tab gNa, ENa, gK, EK, gl, El, Cmem, Idc
@tab E
@tab A classical, conductance-based Hodgkin-Huxley neuron

@item HHRate
@tab a, I0, r, Idc
@tab F
@tab  Rate-based model of the Hodgkin-Huxley neuron

@item DotPoisson
@tab lambda, Vrst, Vfir
@tab E
@tab Duration-less spike Poisson oscillator

@item Poisson
@tab lambda, trel, trel+trfr, Vrst, Vfir
@tab E
@tab  Poisson oscillator

@item VdPol
@tab eta, omegasq
@tab A
@tab  Van der Pol oscillator

@item DotPulse
@tab f, Vrst, Vfir
@tab E
@tab  Dot Pulse generator

@item NMap
@tab Vspike, alpha, gamma, beta, Idc
@tab E
@tab Map neuron

@end multitable

@section Synapse species

In addition to parameters listed in the table, each synapse has a
conductance (parameter @var{gsyn}).

@multitable @columnfractions .1 .25 .15 .5
@headitem Synapse @tab Parameters @tab State vars @tab Description

@item AB
@tab Esyn, Epre, alpha, beta, trel
@tab S
@tab An alpha-beta synapse (Destexhe, Mainen, Sejnowski, 1994)

@item Rall
@tab Esyn, Epre, tau
@tab S, R
@tab Rall synapse (Rall, 1967)

@item Map
@tab tau, delta, Vrev
@tab S
@tab Map synapse

@end multitable



@node Planned features
@chapter Planned features

Interconnections, both control as well as direct, asynchronous stimuli
transduction, between external peer CNrun nodes.

@node Index
@unnumbered @code{CNrun Lua API} Function index

@printindex fn

@bye