You need to use an edge_ngram tokenizer along with a char_filter that strips the whitespace to achieve your use case: the spaces are removed before tokenization, so "AB 987 g567 323" is indexed as prefixes of "AB987g567323". Adding a working example with index mapping, index data, search query, and search result.
Index Mapping:
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "my_tokenizer",
          "char_filter": [
            "replace_whitespace"
          ]
        }
      },
      "tokenizer": {
        "my_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 10,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      },
      "char_filter": {
        "replace_whitespace": {
          "type": "mapping",
          "mappings": [
            "\u0020=>"
          ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "articlenumbers": {
        "type": "text",
        "fields": {
          "analyzed": {
            "type": "text",
            "analyzer": "my_analyzer"
          }
        }
      }
    }
  }
}
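To check what the analyzer actually emits, you can run the _analyze API against the index (named 65936531 here, matching the search result below):

POST /65936531/_analyze
{
  "analyzer": "my_analyzer",
  "text": "AB 987 g567 323"
}

The replace_whitespace char filter strips the spaces first, so the edge_ngram tokenizer sees "AB987g567323" and emits the prefixes "AB", "AB9", "AB98", ... up to the 10-character "AB987g5673" (max_gram is 10), which is why a space-free query like "AB987g" can match.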
Index Data:
{
  "articlenumbers": "AB 987 g567 323"
}
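The document can be indexed with a plain PUT (index name and id taken from the search result below):

PUT /65936531/_doc/1
{
  "articlenumbers": "AB 987 g567 323"
}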
Search Query:
{
  "query": {
    "multi_match": {
      "query": "AB987g",
      "fields": [
        "articlenumbers",
        "articlenumbers.analyzed"
      ]
    }
  }
}
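Note that since the mapping defines no search_analyzer, the query string "AB987g" is run through my_analyzer at search time as well and is itself broken into edge ngrams. If you want the query to be matched as a single whitespace-stripped prefix instead, one option is a companion analyzer built from the keyword tokenizer plus the same char filter (a sketch, with my_search_analyzer as a made-up name):

"analyzer": {
  "my_search_analyzer": {
    "tokenizer": "keyword",
    "char_filter": [
      "replace_whitespace"
    ]
  }
}

and then on the sub-field:

"analyzed": {
  "type": "text",
  "analyzer": "my_analyzer",
  "search_analyzer": "my_search_analyzer"
}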
Search Result:
"hits": [
{
"_index": "65936531",
"_type": "_doc",
"_id": "1",
"_score": 1.4384104,
"_source": {
"articlenumbers": "AB 987 g567 323"
}
}
]