Orama Tokenizers
This package provides additional tokenizers for the Orama search engine.
Available tokenizers (import paths are sketched after this list):
- Chinese (Mandarin, experimental)
- Japanese (experimental)
- Korean (work in progress)
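Each tokenizer is exposed from its own subpath of the package. As a minimal sketch, assuming the Japanese tokenizer follows the same export pattern as the Mandarin one shown in the Usage example below (the Korean tokenizer is still a work in progress, so no import path is shown for it):

// Assumed import path, mirroring the Mandarin entry point used in Usage below
import { createTokenizer as createJapaneseTokenizer } from '@orama/tokenizers/japanese'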
Usage:
import { create } from '@orama/orama'
import { createTokenizer } from '@orama/tokenizers/mandarin'

const db = await create({
  schema: {
    myProperty: 'string',
    anotherProperty: 'number'
  },
  components: {
    tokenizer: await createTokenizer()
  }
})
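Once the database is created with the custom tokenizer, documents are inserted and searched with Orama's standard functions. A minimal sketch, reusing the schema above; the sample documents and search term are illustrative only:

import { create, insert, search } from '@orama/orama'
import { createTokenizer } from '@orama/tokenizers/mandarin'

const db = await create({
  schema: {
    myProperty: 'string',
    anotherProperty: 'number'
  },
  components: {
    tokenizer: await createTokenizer()
  }
})

// Insert a document containing Chinese text; the Mandarin tokenizer
// splits it into searchable tokens at index time.
await insert(db, { myProperty: '北京是中国的首都', anotherProperty: 1 })

// Search with a Chinese term; the same tokenizer is applied to the query.
const results = await search(db, { term: '北京' })
console.log(results.hits)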