elasticsearch search_analyzer not applying filters

Asked: 2018-01-05 00:20:11

Tags: elasticsearch

I cannot get search_analyzer to work as expected. I expect it to apply the specified analyzer to the incoming search string and then run the search with the result.

I set the index analyzer to an ngram tokenizer combined with a lowercase filter and a char filter that replaces - with _.

I then expect the search analyzer to lowercase the input and replace - with _.

Example document: {"filename": "AAA-BBB"}

Example search that should match: {"match": "AAA-b"}

Instead, I have to search with: {"match": "aaa_b"}
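
Spelled out as a full request, the search I expect to work looks roughly like this (a sketch; the index name my_index is an assumption, the filename field comes from the mapping below):

GET /my_index/_search
{
    "query": {
        "match": {
            "filename": "AAA-b"
        }
    }
}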

Elasticsearch version: 5.5

Mapping:

{
    "settings": {
        "analysis": {
            "tokenizer": {
                "DO_tokenizer_ngram": {
                    "type": "ngram",
                    "min_gram": 2,
                    "max_gram": 30
                }
            },
            "analyzer": {
                "DO_analyzer_ngram": {
                    "tokenizer": "DO_tokenizer_ngram",
                    "char_filter": [
                        "do_char_filter_dashes"
                    ],
                    "filter": [
                        "lowercase"
                    ]
                },
                "DO_analyzer_search": {
                    "type": "keyword",
                    "char_filter": [
                        "do_char_filter_dashes"
                    ],
                    "filter": [
                        "lowercase"
                    ]
                }
            },
            "char_filter": {
                "do_char_filter_dashes": {
                    "type": "pattern_replace",
                    "pattern": "(\\d+)-(?=\\d)",
                    "replacement": "$1_"
                }
            }
        }
    },
    "mappings": {
        "images": {
            "properties": {
                "filename": {
                    "type": "text",
                    "analyzer": "DO_analyzer_ngram",
                    "search_analyzer": "DO_analyzer_search"
                }
            }
        }
    }
}
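
What each analyzer actually emits can be checked directly with the _analyze API (a sketch; the index name my_index is an assumption):

GET /my_index/_analyze
{
    "analyzer": "DO_analyzer_search",
    "text": "AAA-b"
}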

1 Answer:

Answer 0 (score: 0):

I'm an idiot. The search analyzer needs:

"tokenizer": "keyword"

instead of

"type": "keyword",