tokenize TokenBigram "aBcDe 123" NormalizerNonexistent
[
  [
    [
      -22,
      0.0,
      0.0
    ],
    "[tokenize] failed to set normalizer: <NormalizerNonexistent>: [info][set][normalizers][(anonymous)] failed to parse normalizers"
  ]
]
#|e| [expr][parse] unknown identifier: <NormalizerNonexistent>
#|e| Syntax error: <NormalizerNonexistent||>: [expr][parse] unknown identifier: <NormalizerNonexistent>
#|e| [info][set][normalizers][(anonymous)] failed to parse normalizers: <NormalizerNonexistent>: Syntax error: <NormalizerNonexistent||>: [expr][parse] unknown identifier: <NormalizerNonexistent>
#|e| [tokenize] failed to set normalizer: <NormalizerNonexistent>: [info][set][normalizers][(anonymous)] failed to parse normalizers: <NormalizerNonexistent>: Syntax error: <NormalizerNonexisten