Elasticsearch Query DSL

The queries we have seen so far were basic commands used to retrieve data, but the real power of Elasticsearch querying lies in its robust, JSON-based Query Domain Specific Language, also called Query DSL. Kibana makes extensive use of Query DSL to get results in the desired format for you. You rarely have to write the query JSON yourself, as Kibana automatically builds the queries and presents the results in a readable format.

For example, in order to get only three results out of all the matching ones, we can specify it like this:

curl -XPOST 'localhost:9200/logstash-*/_search' -d '
{
  "query": { "match_all": {} },
  "size": 3
}'

The response, which contains three of the matching documents, is as follows:

{
  "took" : 390,
  "timed_out" : false,
  "_shards" : {
    "total" : 640,
    "successful" : 640,
    "failed" : 0
  },
  "hits" : {
    "total" : 128,
    "max_score" : 1.0,
    "hits" : [{
        "_index" : "logstash-2014.07.01",
        "_type" : "logs",
        "_id" : "AU2qge3cPoayDyQnreX0",
        "_score" : 1.0,
        "_source" : {
          "message" : ["2014-07-02,583.3526,585.44269,580.39264,582.33765,1056400,582.33765"],
          "@version" : "1",
          "@timestamp" : "2014-07-01T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-07-02",
          "open" : 583.3526,
          "high" : 585.44269,
          "low" : 580.39264,
          "close" : 582.33765,
          "volume" : 1056400,
          "adj_close" : "582.33765"
        }
      }, {
        "_index" : "logstash-2014.07.09",
        "_type" : "logs",
        "_id" : "AU2qge3cPoayDyQnreXv",
        "_score" : 1.0,
        "_source" : {
          "message" : ["2014-07-10,565.91254,576.59265,565.01257,571.10254,1356700,571.10254"],
          "@version" : "1",
          "@timestamp" : "2014-07-09T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-07-10",
          "open" : 565.91254,
          "high" : 576.59265,
          "low" : 565.01257,
          "close" : 571.10254,
          "volume" : 1356700,
          "adj_close" : "571.10254"
        }
      }, {
        "_index" : "logstash-2014.07.21",
        "_type" : "logs",
        "_id" : "AU2qgZixPoayDyQnreXn",
        "_score" : 1.0,
        "_source" : {
          "message" : ["2014-07-22,590.72266,599.65271,590.60266,594.74268,1699200,594.74268"],
          "@version" : "1",
          "@timestamp" : "2014-07-21T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-07-22",
          "open" : 590.72266,
          "high" : 599.65271,
          "low" : 590.60266,
          "close" : 594.74268,
          "volume" : 1699200,
          "adj_close" : "594.74268"
        }
      }
    ]
  }
}
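
If you need to page through the matching documents rather than simply limit them, the from parameter can be combined with size. The following is a minimal sketch against the same logstash-* indices; the offset value is purely illustrative:

curl -XPOST 'localhost:9200/logstash-*/_search' -d '
{
  "query" : { "match_all" : {} },
  "from" : 3,
  "size" : 3
}'

This skips the first three hits and returns the next three, which is handy when stepping through a large result set.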

Similarly, a query to get results sorted by a field looks like this:

curl -XPOST 'localhost:9200/logstash-*/_search' -d '
{
  "query" : { "match_all" : {} },
  "sort" : { "open" : { "order" : "desc" } },
  "size" : 3
}'

You can see the response to the preceding query, sorted by the open field in descending order:

{
  "took" : 356,
  "timed_out" : false,
  "_shards" : {
    "total" : 640,
    "successful" : 640,
    "failed" : 0
  },
  "hits" : {
    "total" : 128,
    "max_score" : null,
    "hits" : [{
        "_index" : "logstash-2014.07.23",
        "_type" : "logs",
        "_id" : "AU2qgZixPoayDyQnreXl",
        "_score" : null,
        "_source" : {
          "message" : ["2014-07-24,596.4527,599.50269,591.77271,593.35266,1035100,593.35266"],
          "@version" : "1",
          "@timestamp" : "2014-07-23T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-07-24",
          "open" : 596.4527,
          "high" : 599.50269,
          "low" : 591.77271,
          "close" : 593.35266,
          "volume" : 1035100,
          "adj_close" : "593.35266"
        },
        "sort" : [596.4527]
      }, {
        "_index" : "logstash-2014.09.21",
        "_type" : "logs",
        "_id" : "AU2qgZioPoayDyQnreW8",
        "_score" : null,
        "_source" : {
          "message" : ["2014-09-22,593.82269,593.95166,583.46271,587.37262,1689500,587.37262"],
          "@version" : "1",
          "@timestamp" : "2014-09-21T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-09-22",
          "open" : 593.82269,
          "high" : 593.95166,
          "low" : 583.46271,
          "close" : 587.37262,
          "volume" : 1689500,
          "adj_close" : "587.37262"
        },
        "sort" : [593.82269]
      }, {
        "_index" : "logstash-2014.07.22",
        "_type" : "logs",
        "_id" : "AU2qgZixPoayDyQnreXm",
        "_score" : null,
        "_source" : {
          "message" : ["2014-07-23,593.23267,597.85266,592.50269,595.98267,1233200,595.98267"],
          "@version" : "1",
          "@timestamp" : "2014-07-22T23:00:00.000Z",
          "host" : "packtpub",
          "path" : "/opt/logstash/input/GOOG.csv",
          "date_of_record" : "2014-07-23",
          "open" : 593.23267,
          "high" : 597.85266,
          "low" : 592.50269,
          "close" : 595.98267,
          "volume" : 1233200,
          "adj_close" : "595.98267"
        },
        "sort" : [593.23267]
      }
    ]
  }
}
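
Sorting is not limited to a single field; an array of sort clauses applies them in order. The sketch below sorts first by volume and then by open, both descending (the field names are taken from the sample documents shown earlier):

curl -XPOST 'localhost:9200/logstash-*/_search' -d '
{
  "query" : { "match_all" : {} },
  "sort" : [
    { "volume" : { "order" : "desc" } },
    { "open" : { "order" : "desc" } }
  ],
  "size" : 3
}'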

Note

More details on Query DSL can be found in the official Elasticsearch documentation:

https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html

Now that we have an understanding of Query DSL in Elasticsearch, let's look at one of the queries automatically created by Kibana, using our example from Chapter 2, Building Your First Data Pipeline with ELK.

Go to the Kibana Visualize page and open the Highest Traded Volume visualization that we created earlier. Clicking the arrow button at the bottom of the visualization reveals the Request and Response views, as shown here:

Elasticsearch Request Body on Kibana UI

Here, we can easily see the request that Kibana sends to Elasticsearch as the Elasticsearch request body:

{
  "query": {
    "filtered": {
      "query": {
        "query_string": {
          "analyze_wildcard": true,
          "query": "*"
        }
      },
      "filter": {
        "bool": {
          "must": [
            {
              "range": {
                "@timestamp": {
                  "gte": 1403880285618,
                  "lte": 1419472695417
                }
              }
            }
          ],
          "must_not": []
        }
      }
    }
  },
  "size": 0,
  "aggs": {
    "1": {
      "max": {
        "field": "volume"
      }
    }
  }
}

The preceding query uses a query filter to apply a range on the @timestamp field, along with an aggregation that finds the maximum value of the volume field. Similarly, we can inspect the requests behind the other visualizations we created; Kibana takes care of building the queries for every type of visualization you create.
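
To experiment with such a request outside Kibana, the same body can be posted directly to the _search endpoint. The following sketch keeps the max aggregation on volume but drops Kibana's time filter for brevity; the aggregation name max_volume is arbitrary, and the logstash-* index pattern is assumed to be the same one used earlier:

curl -XPOST 'localhost:9200/logstash-*/_search' -d '
{
  "size": 0,
  "aggs": {
    "max_volume": {
      "max": {
        "field": "volume"
      }
    }
  }
}'

With "size": 0, no documents are returned and the response contains only the aggregation result.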
