    Interface ICustomAnalyzer

    An analyzer of type custom that combines a Tokenizer with zero or more Token Filters and zero or more Char Filters.

    The custom analyzer accepts the logical/registered name of the tokenizer to use, together with lists of logical/registered names of token filters and char filters.
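    The example below is a minimal sketch of registering a custom analyzer under a logical name at index-creation time using the client's fluent syntax. The client instance, index name, and analyzer name are illustrative; the referenced char filter, tokenizer, and token filters are built-ins.

    using OpenSearch.Client;

    var client = new OpenSearchClient();

    // Sketch: register "my_custom_analyzer", which strips HTML markup, tokenizes
    // with the standard tokenizer, then lowercases terms and removes stop words.
    var createIndexResponse = client.Indices.Create("analyzer-example", c => c
        .Settings(s => s
            .Analysis(a => a
                .Analyzers(an => an
                    .Custom("my_custom_analyzer", ca => ca
                        .CharFilters("html_strip")
                        .Tokenizer("standard")
                        .Filters("lowercase", "stop"))))));

    Once registered, the analyzer can be referenced by its logical name from text field mappings or analyze requests against the same index.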

    Inherited Members
    IAnalyzer.Type
    IAnalyzer.Version
    Namespace: OpenSearch.Client
    Assembly: OpenSearch.Client.dll
    Syntax
    public interface ICustomAnalyzer : IAnalyzer

    Properties


    CharFilter

    An optional list of logical / registered names of char filters.

    Declaration
    [DataMember(Name = "char_filter")]
    IEnumerable<string> CharFilter { get; set; }
    Property Value
    Type: IEnumerable<string>

    Filter

    An optional list of logical / registered names of token filters.

    Declaration
    [DataMember(Name = "filter")]
    IEnumerable<string> Filter { get; set; }
    Property Value
    Type: IEnumerable<string>

    PositionIncrementGap

    When indexing an array of text values, OpenSearch inserts a fake "gap" between the last term of one value and the first term of the next value to ensure that a phrase query doesn’t match two terms from different array elements. Defaults to 100.

    Declaration
    [DataMember(Name = "position_increment_gap")]
    int? PositionIncrementGap { get; set; }
    Property Value
    Type: int?
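    As a sketch of setting this property directly (assuming the concrete CustomAnalyzer implementation of this interface), overriding the default gap of 100 with 0 allows a phrase query to match terms drawn from adjacent elements of a text array:

    using OpenSearch.Client;

    // Sketch only: a custom analyzer whose position increment gap is overridden to 0.
    var analyzer = new CustomAnalyzer
    {
        Tokenizer = "whitespace",
        PositionIncrementGap = 0
    };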

    Tokenizer

    The logical / registered name of the tokenizer to use.

    Declaration
    [DataMember(Name = "tokenizer")]
    string Tokenizer { get; set; }
    Property Value
    Type: string
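    The referenced tokenizer can be a built-in one or a tokenizer registered under its own logical name in the same analysis block, as in the sketch below (the n-gram configuration and all names are illustrative, and the fluent tokenizer registration is assumed to mirror the analyzer registration shown earlier):

    using OpenSearch.Client;

    var client = new OpenSearchClient();

    // Sketch: register an n-gram tokenizer as "my_ngram" and reference it from a
    // custom analyzer by that logical name.
    var response = client.Indices.Create("tokenizer-example", c => c
        .Settings(s => s
            .Analysis(a => a
                .Tokenizers(t => t
                    .NGram("my_ngram", ng => ng
                        .MinGram(2)
                        .MaxGram(3)))
                .Analyzers(an => an
                    .Custom("ngram_analyzer", ca => ca
                        .Tokenizer("my_ngram")
                        .Filters("lowercase"))))));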

    Extension Methods

    SuffixExtensions.Suffix(object, string)