Customizing Your Own Analyzer
阿新 • Published: 2019-01-28
standard tokenizer: splits text on word boundaries
standard token filter: does nothing
lowercase token filter: converts all letters to lowercase
stop token filter (disabled by default): removes stopwords such as a, the, it, etc.
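You can see these components at work with the `_analyze` API. A quick sketch (the sample text is just an illustration): the `standard` analyzer combines the standard tokenizer with the lowercase filter, and because the stop filter is disabled by default, "The" is kept.

```json
GET /_analyze
{
  "analyzer": "standard",
  "text": "The Quick & Brown Fox"
}
```

The standard tokenizer drops the `&` as punctuation, so this returns the tokens `the`, `quick`, `brown`, `fox`.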
Defining a custom analyzer
PUT /my_index
{
  "settings": {
    "analysis": {
      "char_filter": {
        "&_to_and": {
          "type": "mapping",
          "mappings": ["&=> and"]
        }
      },
      "filter": {
        "my_stopwords": {
          "type": "stop",
          "stopwords": ["the", "a"]
        }
      },
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "char_filter": ["html_strip", "&_to_and"],
          "tokenizer": "standard",
          "filter": ["lowercase", "my_stopwords"]
        }
      }
    }
  }
}
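Once the index is created, you can test the custom analyzer with `_analyze` (the sample text below is just an illustration):

```json
GET /my_index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "The quick & brown fox"
}
```

The char filter rewrites `&` to `and`, the standard tokenizer splits the text, lowercase normalizes it, and `my_stopwords` removes "the" and "a", leaving the tokens `quick`, `and`, `brown`, `fox`. To actually use the analyzer, reference it in a field mapping; a sketch for Elasticsearch 7+, assuming a hypothetical text field named `content`:

```json
PUT /my_index/_mapping
{
  "properties": {
    "content": {
      "type": "text",
      "analyzer": "my_analyzer"
    }
  }
}
```

On Elasticsearch 6.x and earlier the `_mapping` request also needs a type name in the URL.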