sql support for dynamic parameters (apache#6974)
* sql support for dynamic parameters

* fixup

* javadocs

* fixup from merge

* formatting

* fixes

* fix it

* doc fix

* remove druid fallback self-join parameterized test

* unused imports

* ignore test for now

* fix imports

* fixup

* fix merge

* merge fixup

* fix test that cannot vectorize

* fixup and more better

* dependency thingo

* fix docs

* tweaks

* fix docs

* spelling

* unused imports after merge

* review stuffs

* add comment

* add ignore text

* review stuffs
clintropolis authored Feb 19, 2020
1 parent e7eb45e commit b408a6d
Showing 41 changed files with 2,103 additions and 157 deletions.
@@ -220,7 +220,7 @@ public void querySql(Blackhole blackhole) throws Exception
final Map<String, Object> context = ImmutableMap.of("vectorize", vectorize);
final AuthenticationResult authenticationResult = NoopEscalator.getInstance()
.createEscalatedAuthenticationResult();
try (final DruidPlanner planner = plannerFactory.createPlanner(context, authenticationResult)) {
try (final DruidPlanner planner = plannerFactory.createPlanner(context, ImmutableList.of(), authenticationResult)) {
final PlannerResult plannerResult = planner.plan(QUERIES.get(Integer.parseInt(query)));
final Sequence<Object[]> resultSequence = plannerResult.run();
final Object[] lastRow = resultSequence.accumulate(null, (accumulated, in) -> in);
@@ -19,6 +19,7 @@

package org.apache.druid.benchmark.query;

import com.google.common.collect.ImmutableList;
import org.apache.calcite.schema.SchemaPlus;
import org.apache.druid.benchmark.datagen.BenchmarkSchemaInfo;
import org.apache.druid.benchmark.datagen.BenchmarkSchemas;
@@ -165,9 +166,9 @@ public void queryNative(Blackhole blackhole)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
public void queryPlanner(Blackhole blackhole) throws Exception
{
final AuthenticationResult authenticationResult = NoopEscalator.getInstance()
.createEscalatedAuthenticationResult();
try (final DruidPlanner planner = plannerFactory.createPlanner(null, authenticationResult)) {
final AuthenticationResult authResult = NoopEscalator.getInstance()
.createEscalatedAuthenticationResult();
try (final DruidPlanner planner = plannerFactory.createPlanner(null, ImmutableList.of(), authResult)) {
final PlannerResult plannerResult = planner.plan(sqlQuery);
final Sequence<Object[]> resultSequence = plannerResult.run();
final Object[] lastRow = resultSequence.accumulate(null, (accumulated, in) -> in);
49 changes: 46 additions & 3 deletions docs/querying/sql.md
@@ -55,6 +55,8 @@
like `100` (denoting an integer), `100.0` (denoting a floating point value), or
timestamps can be written like `TIMESTAMP '2000-01-01 00:00:00'`. Literal intervals, used for time arithmetic, can be
written like `INTERVAL '1' HOUR`, `INTERVAL '1 02:03' DAY TO MINUTE`, `INTERVAL '1-2' YEAR TO MONTH`, and so on.

Druid SQL supports dynamic parameters using question mark (`?`) syntax, where parameter values are bound to `?` placeholders at execution time. To use dynamic parameters, replace any literal in the query with a `?` character and provide the corresponding parameter values at execution time. Parameters are bound to the placeholders in the order in which they are passed.
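The ordering rule can be illustrated with a small sketch (not Druid code; the `bind_positional` helper is hypothetical): positional binding pairs the i-th `?` with the i-th supplied value.

```python
def bind_positional(sql: str, params: list) -> list:
    """Pair each `?` placeholder with a parameter value, in order.

    Illustrative only: counting "?" naively would also match question
    marks inside string literals; a real SQL parser tokenizes first.
    """
    placeholders = sql.count("?")
    if placeholders != len(params):
        raise ValueError(
            f"query has {placeholders} placeholders but {len(params)} parameters"
        )
    # 1-based ordinal positions, matching JDBC-style parameter indexes
    return list(zip(range(1, placeholders + 1), params))

sql = "SELECT COUNT(*) FROM data_source WHERE foo = ? AND __time > ?"
pairs = bind_positional(sql, ["bar", "2000-01-01 00:00:00"])
```

A mismatched count (more or fewer values than placeholders) is an error rather than a partial binding.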

Druid SQL supports SELECT queries with the following structure:

```
@@ -518,6 +520,17 @@
of configuration.
You can make Druid SQL queries using JSON over HTTP by posting to the endpoint `/druid/v2/sql/`. The request should
be a JSON object with a "query" field, like `{"query" : "SELECT COUNT(*) FROM data_source WHERE foo = 'bar'"}`.

##### Request

|Property|Type|Description|Required|
|--------|----|-----------|--------|
|`query`|`String`|SQL query to run|yes|
|`resultFormat`|`String` (`ResultFormat`)|Result format for output|no (default `"object"`)|
|`header`|`Boolean`|Write a column name header for formats that support it|no (default `false`)|
|`context`|`Object`|Connection context map. See [connection context parameters](#connection-context)|no|
|`parameters`|`SqlParameter` list|List of query parameters for parameterized queries|no|


You can use _curl_ to send SQL queries from the command-line:

```bash
@@ -540,7 +553,27 @@
like:
}
```

Metadata is available over the HTTP API by querying [system tables](#metadata-tables).
Parameterized SQL queries are also supported:

```json
{
"query" : "SELECT COUNT(*) FROM data_source WHERE foo = ? AND __time > ?",
"parameters": [
{ "type": "VARCHAR", "value": "bar"},
{ "type": "TIMESTAMP", "value": "2000-01-01 00:00:00" }
]
}
```

##### SqlParameter

|Property|Type|Description|Required|
|--------|----|-----------|--------|
|`type`|`String` (`SqlType`)|String name of the parameter's [`SqlType`](https://calcite.apache.org/avatica/javadocAggregate/org/apache/calcite/avatica/SqlType.html), a friendly wrapper around [`java.sql.Types`](https://docs.oracle.com/javase/8/docs/api/java/sql/Types.html?is-external=true)|yes|
|`value`|`Object`|Value of the parameter|yes|
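Putting the fields together, a request body for the parameterized example above can be assembled as follows. This is a minimal sketch: the `sql_request_body` helper is illustrative, not part of Druid, and actually sending the body (e.g. POSTing it to `/druid/v2/sql/`) is left out.

```python
import json

def sql_request_body(query, typed_values):
    """Build the JSON request body for a parameterized Druid SQL query.

    `typed_values` is an ordered list of (type, value) pairs; order matters,
    since parameters bind to `?` placeholders positionally.
    """
    return json.dumps({
        "query": query,
        "parameters": [{"type": t, "value": v} for t, v in typed_values],
    })

body = sql_request_body(
    "SELECT COUNT(*) FROM data_source WHERE foo = ? AND __time > ?",
    [("VARCHAR", "bar"), ("TIMESTAMP", "2000-01-01 00:00:00")],
)
```

The resulting JSON matches the `query`/`parameters` shape shown in the example above.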


Metadata is also available over the HTTP API by querying [system tables](#metadata-tables).

#### Responses

@@ -617,8 +650,7 @@
try (Connection connection = DriverManager.getConnection(url, connectionProperties)) {
```

Table metadata is available over JDBC using `connection.getMetaData()` or by querying the
["INFORMATION_SCHEMA" tables](#metadata-tables). Parameterized queries (using `?` or other placeholders) don't work properly,
so avoid those.
["INFORMATION_SCHEMA" tables](#metadata-tables).

#### Connection stickiness

@@ -630,6 +662,17 @@
the necessary stickiness even with a normal non-sticky load balancer. Please see

Note that the non-JDBC [JSON over HTTP](#json-over-http) API is stateless and does not require stickiness.

### Dynamic Parameters

You can also use parameterized queries in JDBC code, as in this example:

```java
PreparedStatement statement = connection.prepareStatement("SELECT COUNT(*) AS cnt FROM druid.foo WHERE dim1 = ? OR dim1 = ?");
statement.setString(1, "abc");
statement.setString(2, "def");
final ResultSet resultSet = statement.executeQuery();
```

### Connection context

Druid SQL supports setting connection parameters on the client. The parameters in the table below affect SQL planning.
@@ -186,7 +186,12 @@ public void testComputingSketchOnNumericValues() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<String[]> expectedResults = ImmutableList.of(
new String[] {
"\"AAAAAT/wAAAAAAAAQBgAAAAAAABAaQAAAAAAAAAAAAY/8AAAAAAAAD/wAAAAAAAAP/AAAAAAAABAAAAAAAAAAD/wAAAAAAAAQAgAAAAAAAA/8AAAAAAAAEAQAAAAAAAAP/AAAAAAAABAFAAAAAAAAD/wAAAAAAAAQBgAAAAAAAA=\""
@@ -219,7 +224,12 @@ public void testDefaultCompressionForTDigestGenerateSketchAgg() throws Exception
+ "FROM foo";

// Log query
sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();

// Verify query
Assert.assertEquals(
@@ -248,7 +258,12 @@ public void testComputingQuantileOnPreAggregatedSketch() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<double[]> expectedResults = ImmutableList.of(
new double[] {
1.1,
@@ -297,7 +312,12 @@ public void testGeneratingSketchAndComputingQuantileOnFly() throws Exception
+ "FROM (SELECT dim1, TDIGEST_GENERATE_SKETCH(m1, 200) AS x FROM foo group by dim1)";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<double[]> expectedResults = ImmutableList.of(
new double[] {
1.0,
@@ -363,7 +383,12 @@ public void testQuantileOnNumericValues() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<double[]> expectedResults = ImmutableList.of(
new double[] {
1.0,
@@ -410,7 +435,12 @@ public void testCompressionParamForTDigestQuantileAgg() throws Exception
+ "FROM foo";

// Log query
sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();

// Verify query
Assert.assertEquals(
@@ -221,7 +221,12 @@ public void testApproxCountDistinctHllSketch() throws Exception
+ "FROM druid.foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults;

if (NullHandling.replaceWithDefault()) {
@@ -334,7 +339,12 @@ public void testAvgDailyCountDistinctHllSketch() throws Exception
+ ")";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
1L
@@ -430,7 +440,8 @@ public void testApproxCountDistinctHllSketchIsRounded() throws Exception
+ " HAVING APPROX_COUNT_DISTINCT_DS_HLL(m1) = 2";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results =
sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, DEFAULT_PARAMETERS, authenticationResult).toList();
final int expected = NullHandling.replaceWithDefault() ? 1 : 2;
Assert.assertEquals(expected, results.size());
}
@@ -457,7 +468,12 @@ public void testHllSketchPostAggs() throws Exception
+ "FROM druid.foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
"\"AgEHDAMIAgDhUv8P63iABQ==\"",
@@ -605,7 +621,12 @@ public void testtHllSketchPostAggsPostSort() throws Exception
final String sql2 = StringUtils.format("SELECT HLL_SKETCH_ESTIMATE(y), HLL_SKETCH_TO_STRING(y) from (%s)", sql);

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql2, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql2,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
2.000000004967054d,
@@ -223,7 +223,12 @@ public void testQuantileOnFloatAndLongs() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
1.0,
@@ -303,7 +308,12 @@ public void testQuantileOnComplexColumn() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = lifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = lifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
1.0,
@@ -362,7 +372,12 @@ public void testQuantileOnInnerQuery() throws Exception
+ "FROM (SELECT dim2, SUM(m1) AS x FROM foo GROUP BY dim2)";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults;
if (NullHandling.replaceWithDefault()) {
expectedResults = ImmutableList.of(new Object[]{7.0, 11.0});
@@ -431,7 +446,12 @@ public void testQuantileOnInnerQuantileQuery() throws Exception
+ "FROM (SELECT dim1, dim2, APPROX_QUANTILE_DS(m1, 0.5) AS x FROM foo GROUP BY dim1, dim2) GROUP BY dim1";


final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();

ImmutableList.Builder<Object[]> builder = ImmutableList.builder();
builder.add(new Object[]{"", 1.0});
@@ -512,7 +532,12 @@ public void testDoublesSketchPostAggs() throws Exception
+ "FROM foo";

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
6L,
@@ -679,7 +704,12 @@ public void testDoublesSketchPostAggsPostSort() throws Exception
final String sql2 = StringUtils.format("SELECT DS_GET_QUANTILE(y, 0.5), DS_GET_QUANTILE(y, 0.98) from (%s)", sql);

// Verify results
final List<Object[]> results = sqlLifecycle.runSimple(sql2, QUERY_CONTEXT_DEFAULT, authenticationResult).toList();
final List<Object[]> results = sqlLifecycle.runSimple(
sql2,
QUERY_CONTEXT_DEFAULT,
DEFAULT_PARAMETERS,
authenticationResult
).toList();
final List<Object[]> expectedResults = ImmutableList.of(
new Object[]{
4.0d,