Class AnalyzersDescriptor
Inheritance
AnalyzersDescriptor
Assembly: OpenSearch.Client.dll
Syntax
public class AnalyzersDescriptor : IsADictionaryDescriptorBase<AnalyzersDescriptor, IAnalyzers, string, IAnalyzer>, IDescriptor, IPromise<IAnalyzers>
Constructors
AnalyzersDescriptor()
Declaration
public AnalyzersDescriptor()
Methods
Custom(string, Func<CustomAnalyzerDescriptor, ICustomAnalyzer>)
An analyzer of type custom that combines a tokenizer with zero or more token filters
and zero or more character filters.
The custom analyzer accepts the logical/registered name of the tokenizer to use and a list of
logical/registered names of token filters.
Declaration
public AnalyzersDescriptor Custom(string name, Func<CustomAnalyzerDescriptor, ICustomAnalyzer> selector)
Parameters
Returns
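As a sketch of typical usage (assuming a connected IOpenSearchClient named client; the index name, analyzer name, and the tokenizer/filter names are illustrative), a custom analyzer can be registered in the index settings at creation time:

```csharp
// Register a custom analyzer that combines a char filter, a tokenizer,
// and token filters when creating an index. All names are illustrative.
var response = client.Indices.Create("my-index", c => c
    .Settings(s => s
        .Analysis(a => a
            .Analyzers(an => an
                .Custom("my_custom_analyzer", ca => ca
                    .CharFilters("html_strip")            // zero or more char filters
                    .Tokenizer("standard")                // registered tokenizer name
                    .Filters("lowercase", "asciifolding") // zero or more token filters
                )
            )
        )
    )
);
```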
Fingerprint(string, Func<FingerprintAnalyzerDescriptor, IFingerprintAnalyzer>)
An analyzer of type fingerprint that implements a fingerprinting algorithm used by
the OpenRefine project to assist in clustering.
Declaration
public AnalyzersDescriptor Fingerprint(string name, Func<FingerprintAnalyzerDescriptor, IFingerprintAnalyzer> selector = null)
Parameters
Returns
Icu(string, Func<IcuAnalyzerDescriptor, IIcuAnalyzer>)
Declaration
public AnalyzersDescriptor Icu(string name, Func<IcuAnalyzerDescriptor, IIcuAnalyzer> selector)
Parameters
Returns
Keyword(string, Func<KeywordAnalyzerDescriptor, IKeywordAnalyzer>)
An analyzer of type keyword that “tokenizes” an entire stream as a single token. This is useful for data such as zip codes, ids and so on.
Note that when using mapping definitions, it makes more sense to simply mark the field as not_analyzed.
Declaration
public AnalyzersDescriptor Keyword(string name, Func<KeywordAnalyzerDescriptor, IKeywordAnalyzer> selector = null)
Parameters
Returns
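Because the selector defaults to null, a keyword analyzer can be registered by name alone. A minimal sketch (assuming a connected IOpenSearchClient named client; index and analyzer names are illustrative):

```csharp
// The selector parameter is optional, so registering the analyzer
// under a name with default settings is enough.
var response = client.Indices.Create("my-index", c => c
    .Settings(s => s
        .Analysis(a => a
            .Analyzers(an => an
                .Keyword("my_keyword_analyzer")
            )
        )
    )
);
```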
Kuromoji(string, Func<KuromojiAnalyzerDescriptor, IKuromojiAnalyzer>)
An analyzer tailored for Japanese that is bootstrapped with defaults.
Part of the analysis-kuromoji plugin.
Declaration
public AnalyzersDescriptor Kuromoji(string name, Func<KuromojiAnalyzerDescriptor, IKuromojiAnalyzer> selector = null)
Parameters
Returns
Language(string, Func<LanguageAnalyzerDescriptor, ILanguageAnalyzer>)
A set of analyzers aimed at analyzing specific language text.
Declaration
public AnalyzersDescriptor Language(string name, Func<LanguageAnalyzerDescriptor, ILanguageAnalyzer> selector)
Parameters
Returns
Nori(string, Func<NoriAnalyzerDescriptor, INoriAnalyzer>)
The nori analyzer consists of the following tokenizer and token filters:
- nori_tokenizer
- nori_part_of_speech token filter
- nori_readingform token filter
- nori_number token filter
- lowercase token filter
Declaration
public AnalyzersDescriptor Nori(string name, Func<NoriAnalyzerDescriptor, INoriAnalyzer> selector)
Parameters
Returns
Pattern(string, Func<PatternAnalyzerDescriptor, IPatternAnalyzer>)
An analyzer of type pattern that can flexibly separate text into terms via a regular expression.
Declaration
public AnalyzersDescriptor Pattern(string name, Func<PatternAnalyzerDescriptor, IPatternAnalyzer> selector)
Parameters
Returns
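A sketch of registering a pattern analyzer (assuming a connected IOpenSearchClient named client; the index and analyzer names are illustrative, and the Pattern and Lowercase calls on PatternAnalyzerDescriptor are assumptions about its fluent surface):

```csharp
// Split text into terms on any run of non-word characters,
// lowercasing the resulting terms.
var response = client.Indices.Create("my-index", c => c
    .Settings(s => s
        .Analysis(a => a
            .Analyzers(an => an
                .Pattern("my_pattern_analyzer", pa => pa
                    .Pattern(@"\W+")
                    .Lowercase()
                )
            )
        )
    )
);
```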
Simple(string, Func<SimpleAnalyzerDescriptor, ISimpleAnalyzer>)
An analyzer of type simple that is built using a Lower Case Tokenizer.
Declaration
public AnalyzersDescriptor Simple(string name, Func<SimpleAnalyzerDescriptor, ISimpleAnalyzer> selector = null)
Parameters
Returns
Snowball(string, Func<SnowballAnalyzerDescriptor, ISnowballAnalyzer>)
An analyzer of type snowball that uses the standard tokenizer with the standard filter, lowercase filter, stop filter, and snowball filter.
The Snowball Analyzer is a stemming analyzer from Lucene that was originally based on the snowball project from
snowball.tartarus.org.
Declaration
public AnalyzersDescriptor Snowball(string name, Func<SnowballAnalyzerDescriptor, ISnowballAnalyzer> selector)
Parameters
Returns
Standard(string, Func<StandardAnalyzerDescriptor, IStandardAnalyzer>)
An analyzer of type standard that is built using the Standard Tokenizer, with the Standard Token Filter, Lower Case Token Filter, and Stop Token
Filter.
Declaration
public AnalyzersDescriptor Standard(string name, Func<StandardAnalyzerDescriptor, IStandardAnalyzer> selector)
Parameters
Returns
Stop(string, Func<StopAnalyzerDescriptor, IStopAnalyzer>)
An analyzer of type stop that is built using a Lower Case Tokenizer, with a Stop Token Filter.
Declaration
public AnalyzersDescriptor Stop(string name, Func<StopAnalyzerDescriptor, IStopAnalyzer> selector)
Parameters
Returns
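A sketch of registering a stop analyzer with a custom stop word list (assuming a connected IOpenSearchClient named client; the index and analyzer names are illustrative, and the StopWords call on StopAnalyzerDescriptor is an assumption about its fluent surface):

```csharp
// Register a stop analyzer configured with the predefined
// English stop word list.
var response = client.Indices.Create("my-index", c => c
    .Settings(s => s
        .Analysis(a => a
            .Analyzers(an => an
                .Stop("my_stop_analyzer", sa => sa
                    .StopWords("_english_")
                )
            )
        )
    )
);
```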
UserDefined(string, IAnalyzer)
Declaration
public AnalyzersDescriptor UserDefined(string name, IAnalyzer analyzer)
Parameters
Returns
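UserDefined accepts any IAnalyzer instance directly, bypassing the fluent descriptor overloads. A sketch (assuming a connected IOpenSearchClient named client; the index and analyzer names are illustrative, and the CustomAnalyzer property names are assumptions based on the usual shape of the object-initializer API):

```csharp
// Build an analyzer as a plain object rather than via a descriptor,
// then register it under a name.
var analyzer = new CustomAnalyzer
{
    Tokenizer = "standard",
    Filter = new[] { "lowercase" },
};

var response = client.Indices.Create("my-index", c => c
    .Settings(s => s
        .Analysis(a => a
            .Analyzers(an => an
                .UserDefined("my_user_defined_analyzer", analyzer)
            )
        )
    )
);
```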
Whitespace(string, Func<WhitespaceAnalyzerDescriptor, IWhitespaceAnalyzer>)
An analyzer of type whitespace that is built using a Whitespace Tokenizer.
Declaration
public AnalyzersDescriptor Whitespace(string name, Func<WhitespaceAnalyzerDescriptor, IWhitespaceAnalyzer> selector = null)
Parameters
Returns
Implements
Extension Methods