File: limit-token-count-tokenfilter.asciidoc

[[analysis-limit-token-count-tokenfilter]]
=== Limit Token Count Token Filter

Limits the number of tokens that are indexed per document and field.

[cols="<,<",options="header",]
|=======================================================================
|Setting |Description
|`max_token_count` |The maximum number of tokens that should be indexed
per document and field. Defaults to `1`.

|`consume_all_tokens` |If set to `true`, the filter exhausts the token
stream even after `max_token_count` tokens have been emitted. Defaults
to `false`.
|=======================================================================
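The interaction of the two settings can be illustrated with a small
sketch (plain Python, not Elasticsearch or Lucene code; the function
name and list-based token stream are illustrative assumptions):

```python
def limit_token_count(tokens, max_token_count=1, consume_all_tokens=False):
    """Sketch of the limit filter's semantics.

    Emits at most max_token_count tokens. With consume_all_tokens=True
    the rest of the stream is still read (so upstream components see
    every token) but nothing beyond the limit is emitted.
    """
    emitted = []
    for i, token in enumerate(tokens):
        if i < max_token_count:
            emitted.append(token)
        elif not consume_all_tokens:
            break  # stop reading the stream as soon as the limit is hit
        # else: keep consuming tokens without emitting them
    return emitted
```

For example, `limit_token_count(["quick", "brown", "fox"], max_token_count=2)`
would emit only `["quick", "brown"]`; setting `consume_all_tokens=True`
changes which tokens are *read*, not which are emitted.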

Here is an example:

[source,yaml]
--------------------------------------------------
index :
    analysis :
        analyzer :
            myAnalyzer :
                type : custom
                tokenizer : standard
                filter : [lowercase, five_token_limit]
        filter :
            five_token_limit :
                type : limit
                max_token_count : 5
--------------------------------------------------