Formatting Synonyms

(from github.com/MajidSafari)
Hi again.
I tried to add synonyms to the dictionary. For single-word entries like "test=>exam" it works fine, but for multi-word entries containing spaces, like "online test=>exam", it doesn't work.
The line for this rule in synonym.txt is:
online exam,online test=>exam

Did I do something wrong?

(from github.com/marevol)
If you use the standard tokenizer with the synonym token filter, the standard tokenizer divides "online test" into online and test, so "online test" never reaches the synonym token filter as a single token and does not match.
I think it will work if you use the ngram_synonym tokenizer:
https://github.com/codelibs/elasticsearch-analysis-synonym
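The mismatch above can be sketched in plain Python (a toy model of the token stream, not actual Lucene or Elasticsearch code; the rule table is made up for illustration):

```python
# Why a multi-word synonym rule fails after standard tokenization:
# the synonym filter receives individual tokens one at a time, so a
# rule key containing a space can never match a single token.
rules = {"online test": "exam", "online exam": "exam"}  # multi-word keys from synonym.txt

text = "online test"
tokens = text.split()  # the standard tokenizer splits on whitespace -> ["online", "test"]

# Each token is looked up on its own; neither "online" nor "test"
# matches a rule key, so no synonym is applied.
expanded = [rules.get(tok, tok) for tok in tokens]
print(expanded)  # ['online', 'test']
```

A tokenizer like ngram_synonym avoids this by consulting the synonym list during tokenization, before the text is split apart.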

(from github.com/MajidSafari)
Hi,

I added this to fess.json:

"tokenizer": {
  "2gram_synonym": {
    "type": "ngram_synonym",
    "n": "2",
    "synonyms_path": "${fess.dictionary.path}synonym.txt"
  }
}

and

"persian_analyzer": {
  "type": "custom",
  "tokenizer": "2gram_synonym",
  "char_filter": [ "zero_width_spaces" ],
  "filter": [
    "truncate20_filter",
    "lowercase",
    "arabic_normalization",
    "persian_normalization",
    "persian_stop"
  ]
},

but it is not working.

Please help me.

(from github.com/marevol)
Did you re-create the fess index?

(from github.com/MajidSafari)
yes

(from github.com/marevol)
Hmm…
This is not a Fess problem.
Solving it requires sufficient knowledge of Lucene analysis…

(from github.com/MajidSafari)
@marevol

I cannot solve this. Please help me with sample code or something more.

(from github.com/marevol)
I have no time to support a specific issue outside of Fess…
If you need quick support, please contact Commercial Support: http://www.n2sm.net/en/support/fess_support.html

(from github.com/MajidSafari)
OK, thanks.

(from github.com/MajidSafari)
I solved it with:

"persian_analyzer": {
  "type": "custom",
  "tokenizer": "unigram_synonym_tokenizer",
  "char_filter": [ "zero_width_spaces" ],
  "filter": [
    "truncate20_filter",
    "lowercase",
    "arabic_normalization",
    "persian_normalization",
    "persian_stop"
  ]
},
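The matching tokenizer definition is not shown in the thread; presumably it mirrors the earlier 2gram_synonym entry with n set to 1 (a sketch based on the plugin's documented parameters, not confirmed here):

"tokenizer": {
  "unigram_synonym_tokenizer": {
    "type": "ngram_synonym",
    "n": "1",
    "synonyms_path": "${fess.dictionary.path}synonym.txt"
  }
}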

The rule exam => online test works,

but online test => exam still does not work.

I will keep trying to solve this. Thanks.