After adding synonyms, ik_smart returns ik_max_word's tokenization #1032

Open
LvChengbin opened this issue Nov 21, 2023 · 0 comments

Comments


LvChengbin commented Nov 21, 2023

I defined two custom analyzers, one using ik_max_word as the tokenizer and the other using ik_smart, with all other settings identical, roughly as follows:

{
    "analyzer" : {
        "ik_analyzer_synonyms_max_word" : {
            "type" : "custom",
            "tokenizer" : "ik_max_word",
            "filter" : [
                "ik_synonyms_graph_filter"
            ]
        },
        "ik_analyzer_synonyms_smart" : {
            "type" : "custom",
            "tokenizer" : "ik_smart",
            "filter" : [
                "ik_synonyms_graph_filter"
            ]
        }
    }
}
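
For reference, a minimal way to compare what each analyzer actually emits is the _analyze API; the index name and sample text below are placeholders:

POST /my_index/_analyze
{
    "analyzer" : "ik_analyzer_synonyms_smart",
    "text" : "中华人民共和国国歌"
}

Running the same request with "analyzer" : "ik_analyzer_synonyms_max_word" and diffing the token lists shows whether the two analyzers really produce the same output.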

For the same query, the tokenization results of ik_smart and ik_max_word differ significantly, yet with the two analyzers defined above the results are the same; it looks as if the text is tokenized with ik_max_word and the synonyms are then mapped onto that output.

Is there something wrong with my configuration?
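
The definition of ik_synonyms_graph_filter is not shown above; assuming it is a standard synonym_graph token filter loading rules from a file, it would look roughly like this (the synonyms path is illustrative):

{
    "filter" : {
        "ik_synonyms_graph_filter" : {
            "type" : "synonym_graph",
            "synonyms_path" : "analysis/synonyms.txt"
        }
    }
}

Note that a synonym_graph filter parses its rule file with its own analysis chain, which is one place where tokenizer-dependent behavior can leak in.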

=================

To add: the results are not exactly the same. The output of the ik_smart analyzer is still somewhat more condensed, but some words are still split more finely than plain ik_smart would produce; I haven't yet figured out what the underlying logic is.
