
Kibana / Opensearch Dashboards: Error when creating new index pattern (Bad Request)

source link: https://blog.davidvassallo.me/2023/08/09/kibana-opensearch-dashboards-error-when-creating-new-index-pattern-bad-request/

This article is based on internal documentation from CyberSift, and since this seems to be a problem which others might run into, I’ve posted it here

Problem scenario

When attempting to (re-)create a new index pattern from Kibana / Opensearch Dashboards, we encounter the error Bad Request:

[Screenshot: the "Bad Request" error shown when saving the index pattern]

Troubleshooting steps

First port of call is the kibana/dashboards logs. Tailing these logs while attempting to reproduce the issue shows a line similar to the one below:

{
   "type":"log",
   "@timestamp":"2023-08-09T09:15:19Z",
   "tags":[
      "error",
      "opensearch",
      "data"
   ],
   "pid":96684,
   "message":"[illegal_argument_exception]: Document contains at least one immense term in field=\"index-pattern.fields\" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped.  Please correct the analyzer to not produce such terms.  The prefix of the first immense term is: '[91, 123, 34, 99, 111, 117, 110, 116, 34, 58, 48, 44, 34, 110, 97, 109, 101, 34, 58, 34, 64, 109, 101, 116, 97, 100, 97, 116, 97, 46]...', original message: bytes can be at most 32766 in length; got 50603"
}

The log entry is a bit cryptic, so it’s worth giving a short background:

  • Whenever an index pattern is created via the UI, a new elastic/opensearch document is created in the background, in the hidden index .kibana
  • This means we are actually looking at an elastic/opensearch error: the document creation is failing because of the “immense term”

Rummaging around Google, we find a very good explanation: https://github.com/elastic/kibana/issues/9352

So essentially elastic/opensearch is applying the “keyword” analyzer to the “index-pattern.fields” field in the .kibana index, and the keyword analyzer only accepts terms up to a certain size (32766 bytes). If the index for which you are creating the index pattern has a high number of fields (winlogbeat in our case…) you’d end up with this error. The fix is to change the mapping to “text” rather than “keyword”.
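The numeric prefix in the log message is just UTF-8 bytes, and decoding it shows exactly what is being indexed: the serialized field list of the index pattern. The sketch below (field names are invented; winlogbeat’s real field list differs) decodes that prefix and shows how a wide index overruns Lucene’s 32766-byte single-term limit:

```python
import json

# Lucene refuses to index any single term longer than 32766 bytes (UTF-8).
LUCENE_MAX_TERM_BYTES = 32766

# The "prefix of the first immense term" from the log, decoded from bytes:
prefix = bytes([91, 123, 34, 99, 111, 117, 110, 116, 34, 58, 48, 44, 34, 110,
                97, 109, 101, 34, 58, 34, 64, 109, 101, 116, 97, 100, 97, 116,
                97, 46])
print(prefix.decode("utf-8"))  # → [{"count":0,"name":"@metadata.

# Simulate the value Kibana stores in "index-pattern.fields": one compact JSON
# array describing every field of the source index (field names here are
# made up; a winlogbeat index easily has hundreds of real ones).
fields = [
    {"count": 0, "name": f"winlog.event_data.field_{i}", "type": "string",
     "searchable": True, "aggregatable": True}
    for i in range(400)
]
serialized = json.dumps(fields, separators=(",", ":"))
encoded_len = len(serialized.encode("utf-8"))

# With a "keyword" mapping this whole string is indexed as ONE term,
# so it sails straight past the limit:
print(encoded_len, encoded_len > LUCENE_MAX_TERM_BYTES)
```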

Fixing the issue

  • Check if there are any elasticsearch templates which could affect the .kibana index. If you do find one (as was our case), modify that template instead of creating a new one in the step below
  • Create the template:
POST _template/kibana_template_mapping
{
    "order" : 9999991,
    "index_patterns" : [
      ".kibana*"
    ],
    "mappings" : {
      "properties": {
        "index-pattern.fields":{   #<---- this is the important part
          "type": "text"           #<---- this is the important part
        }
      }
      }
    }
}

In the above we are forcing the field named “index-pattern.fields” to be treated as text, rather than as a keyword
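One practical note before pasting: the arrow annotations in the template above are not legal JSON, so strip them first; a malformed body produces its own, unrelated Bad Request. A quick local sanity check (a sketch using only the standard library; the template name is the one chosen above):

```python
import json

# The template body from the step above (annotations removed), kept as a
# string so we can lint it locally before POSTing it to the cluster.
template_body = """
{
  "order": 9999991,
  "index_patterns": [".kibana*"],
  "mappings": {
    "properties": {
      "index-pattern.fields": { "type": "text" }
    }
  }
}
"""

template = json.loads(template_body)  # raises json.JSONDecodeError if malformed
field_type = template["mappings"]["properties"]["index-pattern.fields"]["type"]
print(field_type)  # → text
```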

  • Rename the original .kibana index to a temporary one. Elastic/opensearch has no rename operation, so in practice this means:
    • re-index .kibana into a temporary index
    • delete the original .kibana
    • re-index the temporary index back into .kibana
  • So, re-indexing to a temporary index (.bak_kibana):
POST /_reindex
{
  "source": {
    "index": ".kibana"
  },
  "dest": {
    "index": ".bak_kibana"
  }
}
  • At this point, if you check the renamed index by issuing

    GET /.bak_kibana

    the mappings should already be correct:

[Screenshot: GET /.bak_kibana response showing "index-pattern.fields" mapped as "text"]

  • Delete the original with
DELETE /.kibana
  • Last, re-index the temporary back to the original:
POST /_reindex
{
  "source": {
    "index": ".bak_kibana"
  },
  "dest": {
    "index": ".kibana"
  }
}
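The reindex / delete / reindex dance is easy to fat-finger, so it can help to script it. A minimal sketch using only the Python standard library, assuming an unauthenticated cluster at http://localhost:9200 (host, credentials, and the .bak_kibana name are all assumptions to adjust):

```python
import json
import urllib.request

HOST = "http://localhost:9200"  # assumed local, unauthenticated cluster

def plan_rename(src=".kibana", tmp=".bak_kibana"):
    """Return the request sequence: copy out, delete original, copy back."""
    def reindex(s, d):
        return ("POST", "/_reindex",
                {"source": {"index": s}, "dest": {"index": d}})
    return [
        reindex(src, tmp),            # 1. re-index .kibana -> .bak_kibana
        ("DELETE", "/" + src, None),  # 2. delete the original (old mapping)
        reindex(tmp, src),            # 3. re-index back under the new template
    ]

def apply_plan(plan):
    """Execute the plan against HOST; call this only against a live cluster."""
    for method, path, body in plan:
        data = json.dumps(body).encode() if body is not None else None
        req = urllib.request.Request(HOST + path, data=data, method=method,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            print(method, path, resp.status)

# Print the plan; run apply_plan(plan_rename()) when pointed at a real cluster.
for method, path, body in plan_rename():
    print(method, path, body if body is not None else "")
```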

Retry the index pattern creation – it works! 🙂
