WARNING: Version 0.90 of Elasticsearch has passed its EOL date.
This documentation is no longer being maintained and may be removed. If you are running this version, we strongly advise you to upgrade. For the latest information, see the current release documentation.
Normalization Token Filter
There are several token filters available which try to normalize the special characters of certain languages. You can currently choose between arabic_normalization and persian_normalization in your token filter configuration. For more information, check the ArabicNormalizer or the PersianNormalizer documentation.
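For example, a minimal sketch of an index whose custom analyzer applies the built-in arabic_normalization filter might look like the following (the index name "arabic_docs" and analyzer name "arabic_text" are only illustrative):

# Hypothetical example: create an index with a custom analyzer that runs
# the standard tokenizer, lowercases tokens, and then applies the
# built-in arabic_normalization token filter.
curl -XPUT 'http://localhost:9200/arabic_docs' -d '{
  "settings": {
    "analysis": {
      "analyzer": {
        "arabic_text": {
          "tokenizer": "standard",
          "filter": ["lowercase", "arabic_normalization"]
        }
      }
    }
  }
}'

A persian_normalization filter can be referenced in the same way in the filter list of a custom analyzer.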
Note: These filters have been available since 0.90.2.