IllegalArgumentException - Only <= 256 finite strings are supported

Asked: 2015-11-17 15:00:26

Tags: elasticsearch

I am running into this error while indexing data. After some research I found an explanation for why it happens and a suggestion to increase max_token_length, so I did that, but I still get the same error: TokenStream expanded to 912 finite strings. Only <= 256 finite strings are supported

Here are my analyzer settings:

"settings": {
    "index": {
        "analysis": {
            "analyzer": {
                "shingle_analyzer": {
                    "tokenizer": "standard",
                    "max_token_length": 920,
                    "filter": ["lowercase", "shingle_filter", "asciifolding"],
                    "char_filter": ["html_strip"],
                    "type": "custom"
                },
                "html_analyzer": {
                    "tokenizer": "standard",
                    "max_token_length": 920,
                    "filter": ["lowercase", "asciifolding"],
                    "char_filter": ["html_strip"],
                    "type": "custom"
                }
            },
            "tokenizer": {
                "standard": {
                    "type": "standard"
                }
            },
            "filter": {
                "shingle_filter": {
                    "min_shingle_size": 2,
                    "max_shingle_size": 5,
                    "type": "shingle"
                }
            }
        }
    }
}

Here is an example of what I am trying to insert:

POST /my_index/my_type/{id}
{
    "myField":{
        "input":"Abcdefghij kl Mnopqrstwx yz Abcdef g Hijklmno pq Rstwxy Zabc (DEF)",
        "weight":2,
        "payload":{
            "iD":"2786129"
        }
    }
}
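For what it's worth, the shingle filter above (sizes 2 to 5, plus unigrams, which the filter emits by default) multiplies the number of tokens produced per input, and the completion suggester then enumerates every path through the resulting token graph, which is presumably where the 912 finite strings come from. A rough sketch of the token inflation alone (Python; tokenization is approximated by splitting on whitespace, with the punctuation around "(DEF)" stripped as the standard tokenizer would):

```python
def shingles(tokens, min_size=2, max_size=5, output_unigrams=True):
    """Approximate the tokens a shingle filter would emit for one input."""
    out = []
    if output_unigrams:
        out.extend(tokens)  # the shingle filter keeps single tokens by default
    for size in range(min_size, max_size + 1):
        # a shingle of size k starts at each of the first len(tokens)-k+1 positions
        for i in range(len(tokens) - size + 1):
            out.append(" ".join(tokens[i:i + size]))
    return out

# The sample input from the question, pre-tokenized by hand:
text = "Abcdefghij kl Mnopqrstwx yz Abcdef g Hijklmno pq Rstwxy Zabc DEF"
tokens = text.split()
print(len(tokens))            # 11 word tokens
print(len(shingles(tokens)))  # 45 tokens after shingling (11 + 10 + 9 + 8 + 7)
```

The finite-string count the error reports grows faster still, because shingles are stacked at the same positions and the suggester counts distinct paths, not tokens.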

Here is the mapping for the my_type properties:

"Suggestion": {
    "properties": {
        "id": {
            "index": "not_analyzed",
            "type": "integer"
        },
        "myField": {
            "type": "completion",
            "analyzer": "shingle_analyzer",
            "search_analyzer": "shingle_analyzer",
            "max_input_length": 150,
            "payloads": true
        }
    }
}
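For context, a completion field mapped like this would be queried with a suggest request along these lines (a sketch against the pre-5.x _suggest endpoint; the suggester name "my_suggest" and the prefix text are made up):

```
POST /my_index/_suggest
{
    "my_suggest": {
        "text": "abc",
        "completion": {
            "field": "myField"
        }
    }
}
```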

What am I missing?

Any help or pointers toward solving this would be appreciated, thanks!

Edit: corrected the missing closing brace of the analyzer block

0 Answers:

No answers yet