Writing SQL Queries (v8.5)

Take your observability to the next level by visualizing exactly what’s vital for you. Create tailored charts by writing your own custom SQL queries.

Want a dashboard that works out of the box?

Use one of our ready-to-use templates to create a dashboard.

Getting started

Create your first tailored chart using a custom SQL query. Start by creating a dashboard in Dashboards → Create dashboard.

Create a new chart on your dashboard and use the query below:

SQL query
SELECT {{time}} AS time,
  countMerge(rows) AS value,
  level AS series
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time, series

Chart preview

  • Dashboard queries use ClickHouse SQL, which is largely similar to the ANSI SQL you’re likely familiar with. The SQL query above uses the countMerge ClickHouse function.

  • On top of SQL, dashboard queries feature variables like {{source}} or {{time}}. Variables conveniently insert the selected source, current time range, and other selected values into your queries. Read more about [query variables][docs-logs-using-logtail-dashboards-variables].

  • Chart queries don't query logs directly. Logs are aggregated into efficient metrics. Logs to metrics aggregations control which columns you can use in your queries. The SQL query above uses the level column.

Logs to metrics

Ingested logs and pure metrics are processed into aggregated metrics and columns you can group by.

Which columns can be used in SQL queries?

You can query metrics and group by columns extracted from your logs and pure metrics:

  • For logs, we automatically extract a few common metrics and columns based on the source type. You can add more columns to use in dashboard queries.

  • Pure metrics can always be filtered and grouped by name and tags. Aggregated value_* and rate_* metrics are also available. Read more about writing SQL queries for metrics.
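For illustration only, a chart of a pure metric's average value grouped by metric name might look like the sketch below; the value_avg metric assumes your source has the avg aggregation enabled:

```sql
SELECT {{time}} AS time,
  avgMerge(value_avg) AS value,
  name AS series
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time, series
```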

Find the complete list of columns extracted from your logs and pure metrics in the Sources → your source → Configure → Advanced settings → Logs to metrics section.

How to use metrics in queries?

Under Logs to metrics you will find aggregated metrics and columns you can group by:

  • Metrics: Aggregated values you can use in your queries. Use aggregation merge functions with the matching metric suffix: avgMerge(flight_delay_avg). Check which aggregations are available for specific metrics in the Metrics table.

Need to count the number of rows?

Use countMerge(rows) to get the number of underlying log lines or metric data points. Note that you don't have to add the _count suffix to the rows metric.

  • Group by: Columns you can use for filtering, grouping, or anywhere else in your queries. Use them by their Column name just like regular database columns.

Screenshot of Logs to metrics tables

Check out the examples below to see how to use metrics and group by columns.

Query examples

Number of errors

Chart the number of logs with the error log level:

SQL query
SELECT {{time}} AS time, countMerge(rows) AS value
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
  AND level = 'error'
GROUP BY time

Use countMerge(rows) to count the number of log lines. Using count(*) would give you incorrect results, since each row is already an aggregated batch of log lines.

Filter by the level column. The column has to be present in Logs to metrics → Group by.

Average request duration

Chart the average duration of requests:

SQL query
SELECT {{time}} AS time,
  avgMerge(request_duration_avg) AS "Request duration"
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time

Use avgMerge(request_duration_avg) to calculate an average for the request_duration metric. The metric has to be added in Logs to metrics → Metrics with the avg aggregation.

The column name Request duration will be used in chart tooltips and the legend. Select which column should be charted on the Y-axis on the right side in Chart settings → Axes.

Percentile request duration

Chart the 95th percentile request duration:

SQL query
SELECT {{time}} AS time,
  quantilesMerge(0.5, 0.9, 0.95, 0.99)(request_duration_quantiles)[3]
    AS "95th percentile duration"
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time

Here we have the request_duration metric with p50, p90, p95, and p99 percentiles set up in Logs to metrics. Calculate the 95th percentile request duration using quantilesMerge. The quantilesMerge function returns all four percentiles as an array; since ClickHouse arrays are 1-indexed, we select the third element, the 95th percentile, using [3].

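Filtering by aggregated values

Chart only time buckets where an aggregated value crosses a threshold using a HAVING clause. The sketch below reuses the request_duration metric from the previous example and assumes an illustrative 500 ms threshold:

```sql
SELECT {{time}} AS time,
  avgMerge(request_duration_avg) AS value
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time
HAVING value > 500
```

Unlike WHERE, HAVING runs after GROUP BY, so it can reference aggregated values like value.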

Adding metrics and group by columns

Want to use more fields in your queries?

Add metrics and columns to group by in the Sources → your source → Configure → Advanced settings → Logs to metrics section.

Adding metrics

Add metrics for numeric fields in your logs that you want to chart. We'll process the fields into aggregated metrics.

Fill out the inputs:

  • Metric name: Name your metric.
  • SQL expression: Use JSONExtract() or any valid ClickHouse SQL expression.
    • For example, use JSONExtract(json, 'request', 'duration_ms', Nullable(Float64)) to extract the request.duration_ms field from your logs.
    • Using Nullable(Float64) parses the value as a float and prevents errors when the field is missing. See all ClickHouse data types.
  • Aggregations: Choose aggregations you want to use in your queries. You can select multiple aggregations. For example, use avg and p95 to get the average and 95th percentile values.

Screenshot of adding a new metric

Adding columns to group by

Add the fields that you want to filter and group by. We’ll extract the fields for use in dashboard queries.

Fill out the inputs:

  • Column name: Name your extracted column.
  • SQL expression: Use JSONExtract() or any valid ClickHouse SQL expression.
    • For example, use JSONExtract(json, 'context', 'system', 'hostname', Nullable(String)) to extract the context.system.hostname field from your logs.
    • Using Nullable(String) parses the value as a string and prevents errors when the field is missing. See all ClickHouse data types.
  • Codec: Choose a type with a compression codec based on the data in your log field. For example, use String (LowCardinality) for strings.

Screenshot of adding a new group by column
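Once a group-by column is added, use it like any regular database column. As an illustrative sketch, assuming you named the column extracted above hostname, you could chart log counts per host:

```sql
SELECT {{time}} AS time,
  countMerge(rows) AS value,
  hostname AS series
FROM {{source}}
WHERE time BETWEEN {{start_time}} AND {{end_time}}
GROUP BY time, series
```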