Execution example::

  tokenizer_list
  # [
  #   [
  #     0,
  #     1337566253.89858,
  #     0.000355720520019531
  #   ],
  #   [
  #     {
  #       "name": "TokenMecab"
  #     },
  #     {
  #       "name": "TokenDelimit"
  #     },
  #     {
  #       "name": "TokenUnigram"
  #     },
  #     {
  #       "name": "TokenBigram"
  #     },
  #     {
  #       "name": "TokenTrigram"
  #     },
  #     {
  #       "name": "TokenBigramSplitSymbol"
  #     },
  #     {
  #       "name": "TokenBigramSplitSymbolAlpha"
  #     },
  #     {
  #       "name": "TokenBigramSplitSymbolAlphaDigit"
  #     },
  #     {
  #       "name": "TokenBigramIgnoreBlank"
  #     },
  #     {
  #       "name": "TokenBigramIgnoreBlankSplitSymbol"
  #     },
  #     {
  #       "name": "TokenBigramIgnoreBlankSplitSymbolAlpha"
  #     },
  #     {
  #       "name": "TokenBigramIgnoreBlankSplitSymbolAlphaDigit"
  #     },
  #     {
  #       "name": "TokenDelimitNull"
  #     },
  #     {
  #       "name": "TokenRegexp"
  #     },
  #     {
  #       "name": "TokenNgram"
  #     },
  #     {
  #       "name": "TokenPattern"
  #     },
  #     {
  #       "name": "TokenTable"
  #     },
  #     {
  #       "name": "TokenDocumentVectorTFIDF"
  #     },
  #     {
  #       "name": "TokenDocumentVectorBM25"
  #     }
  #   ]
  # ]
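The response above follows the usual Groonga JSON envelope: a header array ``[return_code, unix_time_when_command_is_started, elapsed_time]`` followed by a body listing one ``{"name": ...}`` object per registered tokenizer. A minimal sketch of consuming it from a client, with the literal copied from the example output above:

```python
import json

# Response copied from the execution example above. Groonga returns
# [header, body]; the header is [return_code, start_time, elapsed]
# and the body is a list of {"name": ...} objects, one per tokenizer.
response_text = """
[
  [0, 1337566253.89858, 0.000355720520019531],
  [
    {"name": "TokenMecab"},
    {"name": "TokenDelimit"},
    {"name": "TokenUnigram"},
    {"name": "TokenBigram"},
    {"name": "TokenTrigram"},
    {"name": "TokenBigramSplitSymbol"},
    {"name": "TokenBigramSplitSymbolAlpha"},
    {"name": "TokenBigramSplitSymbolAlphaDigit"},
    {"name": "TokenBigramIgnoreBlank"},
    {"name": "TokenBigramIgnoreBlankSplitSymbol"},
    {"name": "TokenBigramIgnoreBlankSplitSymbolAlpha"},
    {"name": "TokenBigramIgnoreBlankSplitSymbolAlphaDigit"},
    {"name": "TokenDelimitNull"},
    {"name": "TokenRegexp"},
    {"name": "TokenNgram"},
    {"name": "TokenPattern"},
    {"name": "TokenTable"},
    {"name": "TokenDocumentVectorTFIDF"},
    {"name": "TokenDocumentVectorBM25"}
  ]
]
"""

header, body = json.loads(response_text)
names = [entry["name"] for entry in body]

print(header[0])  # 0 means the command succeeded
print(names[0])   # TokenMecab
print(len(names)) # 19
```

A ``return_code`` of ``0`` in the header means success; the tokenizer names extracted here are the values you can pass as ``default_tokenizer`` when creating a lexicon table.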