4. Analytics queries with DingoQL

Now that we have DingoQL set up and connected to our database, we can start querying it. DingoQL makes analytics queries extremely easy to run. Before we start, let's first familiarize ourselves with DingoQL's API Explorer.

API Explorer

The API Explorer is an IDE-like environment provided in DingoQL's admin panel. Once we log into the admin panel, we can open it by selecting "API Explorer" from the top navigation.

API Explorer

The API Explorer is a GraphiQL playground. GraphiQL is a popular GraphQL IDE, and it is integrated here so that we can conveniently develop DingoQL queries.

In the left panel of the API Explorer we can write our queries. To test this out, copy and paste the following query into the left panel and click the red play / execute button to run it.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight }) {
          jsonData
        }
      }
    }
  }
}

We will see the JSON response in the right panel. In a later tutorial we will learn how this JSON response can be consumed by your application to display different types of graphs.
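As a small preview of that integration: a DingoQL query is just a GraphQL document, so an application sends it over HTTP like any other GraphQL request. A minimal Python sketch of building that request, assuming the server follows the common GraphQL-over-HTTP convention of a JSON body with a "query" key (the actual endpoint URL and transport details are covered in the server integration tutorial):

```python
import json

# The analytics query from above, as a plain GraphQL document string.
query = """
{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight }) {
          jsonData
        }
      }
    }
  }
}
"""

# Conventional GraphQL-over-HTTP body: a JSON object with a "query" key.
payload = json.dumps({"query": query})

# An HTTP client (requests, urllib, fetch, ...) would POST `payload`
# to the DingoQL endpoint with Content-Type: application/json.
```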

The API Explorer also has a bottom panel, which shows the response in tabular form.

API Explorer - Bottom Panel

The bottom panel of the API Explorer also contains a "Schema and Security" tab, where we can view the fields available in the table we are querying.

API Explorer - Schema and Security

Now that we have an understanding of DingoQL's API Explorer, let's learn how to write analytics queries.

Queries in DingoQL

We will go through a few examples of queries on a table orders, which contains the following columns.

Column name      Data type
freight          Float
shipCity         String
orderId          Integer
customerId       Char
employeeId       Integer
orderDate        Timestamp
requiredDate     Timestamp
shippedDate      Timestamp
shipVia          Integer
shipName         String
shipAddress      String
shipRegion       String
shipPostalCode   String
shipCountry      String

Example 1: Simple Queries

All analytics queries in DingoQL start at the root field q.

Then we pick a datasource, which in this case is sampledb. Next we pick a schema within that database; for this example the schema is public.

In the last step we pick the table we want to query. For this example we will select the orders table.

Here orders is also the GraphQL query name, so, like any GraphQL query, we have to specify the response fields we want. Specifying the response fields jsonData and dataTypes gives us the following query.

{
  q {
    sampledb {
      public {
        orders {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

But this analytics query is not yet valid, because we have to specify how to query the orders table. For this example we can pick two fields, shipName and freight. This will give us a two-dimensional dataset: shipName vs freight.

{
  q {
    sampledb {
      public {
        orders(fields: [ shipName, freight ]) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}
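Conceptually, this query projects two columns out of the orders table, much like SELECT "shipName", freight FROM orders in SQL. A small Python sketch of that projection over illustrative rows (the data below is made up, not taken from sampledb):

```python
# Illustrative rows standing in for the orders table.
orders = [
    {"shipName": "Alfreds Futterkiste", "freight": 29.46, "shipCity": "Berlin"},
    {"shipName": "Bolido Comidas",      "freight": 12.75, "shipCity": "Madrid"},
    {"shipName": "Romero y tomillo",    "freight": 8.12,  "shipCity": "Madrid"},
]

# fields: [ shipName, freight ] keeps just those two columns per row.
dataset = [{"shipName": o["shipName"], "freight": o["freight"]} for o in orders]
```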

Example 2: Aggregate / Metric Queries

Let's say we want to find the sum of freight across all orders. We can do that by setting the input field metric as follows.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight }) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

Here we are specifying that we want to execute a SUM query on the freight field.

Next we will add another dimension to this query by setting one more field, as follows.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight }, fields: [ shipCity ]) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

This gives a dataset of the sum of freight per shipCity.
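A metric combined with a grouping field behaves like SQL's SELECT "shipCity", SUM(freight) FROM orders GROUP BY "shipCity". A Python sketch of that aggregation over illustrative rows (made up for this example):

```python
from collections import defaultdict

# Illustrative rows standing in for the orders table.
orders = [
    {"shipCity": "Berlin", "freight": 29.46},
    {"shipCity": "Madrid", "freight": 12.75},
    {"shipCity": "Madrid", "freight": 8.12},
]

# metric: { operator: SUM, field: freight } aggregated per
# fields: [ shipCity ] bucket.
freight_by_city = defaultdict(float)
for o in orders:
    freight_by_city[o["shipCity"]] += o["freight"]

# One SUM per distinct shipCity, e.g. Madrid totals about 20.87.
```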

Example 3: Time Series Queries

Plotting a metric against time is probably the most common use case in analytics. For example, we might want to plot the average freight per day. This query can be written by selecting the field shippedDate_DAY as follows.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: AVG, field: freight },
        fields: [ shippedDate_DAY ]) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

We can change the unit of the time axis by changing the postfix to _MONTH, _YEAR, etc.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: AVG, field: freight },
        fields: [ shippedDate_MONTH ]) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

We can also add more fields. For example, we might want the average freight for each day, broken down by the ship city of the order.

{
  q {
    sampledb {
      public {
        orders(
          metric: { operator: AVG, field: freight },
          fields: [ shippedDate_DAY, shipCity ]) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}
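The _DAY postfix amounts to truncating each timestamp to its calendar day before grouping, with the metric then computed per bucket. A Python sketch of AVG freight per shippedDate_DAY, over made-up rows:

```python
from collections import defaultdict
from datetime import datetime

# Illustrative rows standing in for the orders table.
orders = [
    {"shippedDate": datetime(1997, 7, 4, 9, 30), "freight": 30.0},
    {"shippedDate": datetime(1997, 7, 4, 17, 5), "freight": 10.0},
    {"shippedDate": datetime(1997, 7, 5, 8, 0),  "freight": 24.0},
]

# shippedDate_DAY: truncate each timestamp to its calendar day...
buckets = defaultdict(list)
for o in orders:
    buckets[o["shippedDate"].date()].append(o["freight"])

# ...then compute AVG within each day bucket.
avg_by_day = {day: sum(vals) / len(vals) for day, vals in buckets.items()}

# Swapping the postfix to _MONTH would simply bucket on (year, month)
# instead of the full date.
```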

Example 4: Filters in Analytics Queries

Filters are also extremely simple in DingoQL; they are defined by setting the filters field. Let's say we want the sum of freight for all orders shipped to the city "Madrid". We can do that with the following query.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight },
        filters: {
          shipCity_EQ: "Madrid"
        }) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}

Here we are setting the field shipCity_EQ in the filters object: shipCity is the field name, and the postfix _EQ is the equal-to comparison operator.

Let's say we want the sum of freight for all orders from "Madrid" where the freight is greater than 50. We can do that by also setting the field freight_GT in the filters input. Here the postfix _GT stands for the greater-than operator.

{
  q {
    sampledb {
      public {
        orders(metric: { operator: SUM, field: freight },
        filters: {
          shipCity_EQ: "Madrid",
          freight_GT: 50
        }) {
          jsonData,
          dataTypes
        }
      }
    }
  }
}
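Each filter entry corresponds to a WHERE condition, and multiple entries combine with AND, as in shipCity = 'Madrid' AND freight > 50. A Python sketch of that filtering followed by the SUM metric, over made-up rows:

```python
# Illustrative rows standing in for the orders table.
orders = [
    {"shipCity": "Madrid", "freight": 75.0},
    {"shipCity": "Madrid", "freight": 20.0},  # fails freight_GT: 50
    {"shipCity": "Berlin", "freight": 90.0},  # fails shipCity_EQ: "Madrid"
]

# Filters combine with AND: shipCity_EQ: "Madrid" AND freight_GT: 50.
matching = [o for o in orders
            if o["shipCity"] == "Madrid" and o["freight"] > 50]

# metric: { operator: SUM, field: freight } over the filtered rows.
total_freight = sum(o["freight"] for o in matching)
```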

I hope this tutorial gives you a basic idea of how to write analytics queries in DingoQL.

DingoQL also supports more complex analytics queries. To learn about all of its querying capabilities, please refer to the documentation.

Coming Soon: Server integration