Discussion thread | https://lists.apache.org/thread/d1owrg8zh77v0xygcpb93fxt0jpjdkb3 |
---|---|
Vote thread | https://lists.apache.org/thread/7jbmg22lnww31sbfdzztwrzgm6bkhjrj |
JIRA | - |
Release | 1.18.0 |
Please keep the discussion on the mailing list rather than commenting on the wiki (wiki discussions get unwieldy fast).
draw.io Board Diagram
There are 7 main classes in Flink Jdbc Driver: FlinkDriver, FlinkDataSource, FlinkConnection, FlinkStatement, FlinkResultSet, FlinkDatabaseMetaData and FlinkResultSetMetaData, which implement the jdbc interfaces Driver, DataSource, Connection, Statement, ResultSet, DatabaseMetaData and ResultSetMetaData respectively.
- FlinkDriver parses the gateway address from the url and creates FlinkConnection
- FlinkDataSource manages a connection pool for flink jdbc; it creates a configured number of connections and chooses one for the client directly
- FlinkConnection creates an Executor according to the gateway address. When the Connection is closed, it closes the connection with the gateway through the Executor
- FlinkStatement gets the Executor from FlinkConnection and submits sql queries to it. After a query is executed, FlinkStatement gets the StatementResult from the Executor and creates a FlinkResultSet
- FlinkResultSet is an iterator; it gets results from StatementResult and returns them to users
- FlinkDatabaseMetaData provides meta data of catalogs, databases and tables
- FlinkResultSetMetaData provides meta data of the ResultSet such as columns
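The pooling behaviour described for FlinkDataSource can be sketched as below. This is an illustrative, hypothetical sketch (the class and method names are not from the FLIP), assuming a fixed-size pool that is filled eagerly and handed out round-robin:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical sketch: a pool that creates a configured number of
// connections up front and chooses one for each client directly.
class SimpleConnectionPool<C> {
    private final List<C> connections = new ArrayList<>();
    private int next = 0;

    SimpleConnectionPool(int maxActive, Supplier<C> factory) {
        // Eagerly create the configured number of connections.
        for (int i = 0; i < maxActive; i++) {
            connections.add(factory.get());
        }
    }

    // Hand out a connection, cycling through the pool round-robin.
    synchronized C getConnection() {
        C connection = connections.get(next);
        next = (next + 1) % connections.size();
        return connection;
    }
}
```

A real implementation would also track busy connections and validate them; the sketch only shows the "create a specific count and choose one" behaviour the FLIP describes.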
draw.io Board Diagram
There are Sessions and Operations in SqlGateway. SqlGateway will open a Session for each FlinkConnection, and then perform multiple Operations in one Session. When users create a FlinkConnection by FlinkDriver with SqlGateway, it will open an existing or a new Session. Any time users want to issue SQL statements to the database, they require a FlinkStatement instance from FlinkConnection. Once users have a FlinkStatement, they can issue a query. This will return a FlinkResultSet instance, which contains the entire result. Each operation, such as the execution of a query (Flink job) or fetching results in FlinkResultSet, will be an Operation in the Session of SqlGateway.
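As a rough illustration of this lifecycle, the toy model below tracks one Session per connection and records every query execution or fetch as an Operation. All names here are hypothetical; the real SqlGateway API differs:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of the mapping described above; not the SqlGateway API.
class ToyGateway {
    static class Session {
        // Every executed query or result fetch becomes an Operation here.
        final List<String> operations = new ArrayList<>();
    }

    // A FlinkConnection corresponds to one opened Session.
    Session openSession() {
        return new Session();
    }

    // A FlinkStatement execution or FlinkResultSet fetch submits an
    // Operation to the connection's Session and gets back a handle.
    String submitOperation(Session session, String statement) {
        session.operations.add(statement);
        return "handle-" + session.operations.size();
    }
}
```

The point of the model is only that many Operations share one Session, mirroring how many statements and fetches share one FlinkConnection.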
Use Case
The Flink Jdbc Driver module will be packaged into an independent jar file such as flink-table-jdbc-driver-{version}.jar, which will contain the classes of the flink jdbc driver and shaded flink classes such as data types. Users only need to add the jdbc dependency in pom.xml or add the jar to the classpath of an external Jdbc Tool such as sqlite.
Code Block
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-jdbc-driver</artifactId>
    <version>${flink.version}</version>
</dependency>
More information about Flink Jdbc Driver
- Driver Name: org.apache.flink.table.jdbc.FlinkDriver
- Flink Connection URL: Users can use default catalog and database directly, or set them in url. Users can also set custom parameters in url to open the session in SqlGateway with key1=val1&key2=val2&...
- jdbc:flink://{sql-gateway.host}:{sql-gateway.port}?key1=val1&key2=val2
- jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}?key1=val1&key2=val2
- jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}/{database name}?key1=val1&key2=val2
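A driver could extract the gateway address, catalog, database and session properties from urls of the forms above roughly as follows. This is a hypothetical sketch, not the FLIP's actual parsing code:

```java
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

// Hypothetical parser for urls of the form
// jdbc:flink://host:port[/catalog[/database]]?key1=val1&key2=val2
class FlinkUrlParser {
    static final String PREFIX = "jdbc:flink://";

    static Map<String, String> parse(String url) {
        if (!url.startsWith(PREFIX)) {
            throw new IllegalArgumentException("Not a flink jdbc url: " + url);
        }
        // Strip "jdbc:" and reuse URI parsing for host/port/path/query.
        URI uri = URI.create(url.substring("jdbc:".length()));
        Map<String, String> result = new HashMap<>();
        result.put("host", uri.getHost());
        result.put("port", String.valueOf(uri.getPort()));
        String[] path = uri.getPath().split("/");
        if (path.length > 1) result.put("catalog", path[1]);
        if (path.length > 2) result.put("database", path[2]);
        // Remaining query parameters become session properties.
        if (uri.getQuery() != null) {
            for (String pair : uri.getQuery().split("&")) {
                String[] kv = pair.split("=", 2);
                result.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
        }
        return result;
    }
}
```

Missing catalog and database segments simply leave the session on the gateway defaults, matching the first url form.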
- Currently SqlGateway does not support authentication, so user and password in the connection are not used.

Use Flink Jdbc Driver in Java code

Code Block
String driverName = "org.apache.flink.table.jdbc.FlinkDriver";
String url = "jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}/{database name}?key1=val1&key2=val2";
Class.forName(driverName);
try (Connection connection = DriverManager.getConnection(url)) {
    try (Statement statement = connection.createStatement()) {
        try (ResultSet resultSet = statement.executeQuery("SELECT * FROM {Your Table}")) {
            while (resultSet.next()) {
                // Do your work ...
            }
        }
        try (ResultSet resultSet = statement.executeQuery("SELECT * FROM T1 JOIN T2 ON T1.id = T2.id ...")) {
            while (resultSet.next()) {
                // Do your work ...
            }
        }
    }
}
- Users can also add the flink-table-jdbc-driver-{version}.jar to the classpath of external jdbc tools.
Data Types
The following basic flink data types can be converted to sql data types in this FLIP, and more types can be supported as needed in the future.
Flink Data Type | Java Sql Data Type | Java Data Type |
---|---|---|
CharType/VarCharType | CHAR/VARCHAR | String |
BooleanType | BOOLEAN | Boolean |
TinyIntType | TINYINT | Byte |
SmallIntType | SMALLINT | Short |
IntType | INTEGER | Int |
BigIntType | BIGINT | Long |
FloatType | FLOAT | Float |
DoubleType | DOUBLE | Double |
DecimalType | DECIMAL | BigDecimal |
BinaryType/VarBinaryType | BINARY/VARBINARY | byte[] |
DateType | DATE | Date |
TimeType | TIME | Time |
TimestampType | TIMESTAMP | Timestamp |
ZonedTimestampType | TIMESTAMP_WITH_TIMEZONE | OffsetDateTime |
LocalZonedTimestampType | TIMESTAMP_WITH_LOCAL_TIMEZONE | Timestamp |
ArrayType | ARRAY | Array |
RowType | ROW(Not in java.sql.Types) | Row(Flink Row Data) |
MapType | MAP | Map<K, V> |
Currently TIMESTAMP_WITH_LOCAL_TIMEZONE does not exist in java.sql.Types, but it is supported by Flink. Users can define a field as (f TIMESTAMP(p) WITH LOCAL TIME ZONE) and set the time zone through the connection parameters or dynamic parameters in the console via table.local-time-zone. After that, users will get a Timestamp which is automatically converted from the stored time data according to the given time zone.
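As an illustration of the type table above, a converter from Flink type names to java.sql.Types ids might look like the sketch below. The class and method names are hypothetical; the FLIP does not define this helper, and types with no java.sql.Types entry (such as RowType) fall back to Types.OTHER here:

```java
import java.sql.Types;
import java.util.HashMap;
import java.util.Map;

// Hypothetical mapping from Flink data type names to java.sql.Types ids,
// following the conversion table above.
class JdbcTypeMapping {
    private static final Map<String, Integer> TYPES = new HashMap<>();
    static {
        TYPES.put("CharType", Types.CHAR);
        TYPES.put("VarCharType", Types.VARCHAR);
        TYPES.put("BooleanType", Types.BOOLEAN);
        TYPES.put("TinyIntType", Types.TINYINT);
        TYPES.put("SmallIntType", Types.SMALLINT);
        TYPES.put("IntType", Types.INTEGER);
        TYPES.put("BigIntType", Types.BIGINT);
        TYPES.put("FloatType", Types.FLOAT);
        TYPES.put("DoubleType", Types.DOUBLE);
        TYPES.put("DecimalType", Types.DECIMAL);
        TYPES.put("BinaryType", Types.BINARY);
        TYPES.put("VarBinaryType", Types.VARBINARY);
        TYPES.put("DateType", Types.DATE);
        TYPES.put("TimeType", Types.TIME);
        TYPES.put("TimestampType", Types.TIMESTAMP);
        TYPES.put("ZonedTimestampType", Types.TIMESTAMP_WITH_TIMEZONE);
        TYPES.put("ArrayType", Types.ARRAY);
    }

    // Return the java.sql.Types id, or Types.OTHER for types such as
    // RowType that are not in java.sql.Types.
    static int toSqlType(String flinkTypeName) {
        return TYPES.getOrDefault(flinkTypeName, Types.OTHER);
    }
}
```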
Java Sql Interfaces
There are many methods in Jdbc Driver; this FLIP only implements the basic methods first, and more methods will be implemented later when they are needed.
- Methods in FlinkDriver
Code Block
/* Jdbc Driver for flink sql gateway. */
public class FlinkDriver implements Driver {
    /* Connect sql gateway with given url and open/create session with given properties. */
    @Override
    public Connection connect(String url, Properties info) throws SQLException;
}
- Methods in FlinkConnection
Code Block
/* Connection to flink sql gateway for jdbc driver. */
public class FlinkConnection implements Connection {
    /* Create statement from connection. */
    @Override
    public Statement createStatement() throws SQLException;
    /* Close session in sql gateway. */
    @Override
    public void close() throws SQLException;
    /* Set the given catalog for the session in sql gateway. */
    @Override
    public void setCatalog(String catalog) throws SQLException;
    /* Get current catalog name from session. */
    @Override
    public String getCatalog() throws SQLException;
    /* Get FlinkDatabaseMetaData instance for the current catalog. */
    @Override
    public DatabaseMetaData getMetaData() throws SQLException;
    /* Set the given database for the session in sql gateway. */
    @Override
    public void setSchema(String schema) throws SQLException;
    /* Get current database name from session. */
    @Override
    public String getSchema() throws SQLException;
}
- Methods in FlinkDataSource
Code Block
/* Jdbc DataSource manages connections for client, we can support more operations in it in the future. */
public class FlinkDataSource implements DataSource {
    /* The max count of connections which the data source holds. */
    private int maxActive;
    /* Set the url of connection. */
    public synchronized void setUrl(String url);
    /* Set the driver class name for the source. */
    public synchronized void setDriverClassName(String driverClassName);
    /* Set the max active connection count for the source. */
    public synchronized void setMaxActive(int maxActive);
    /* Get a connection from data source. */
    @Override
    public Connection getConnection() throws SQLException;
    @Override
    public Connection getConnection(String username, String password) throws SQLException;
}
- Methods in FlinkStatement
Code Block
/* Statement in flink jdbc driver. */
public class FlinkStatement implements Statement {
    /* Submit sql to sql gateway and get result set. */
    @Override
    public ResultSet executeQuery(String sql) throws SQLException;
    /* Execute given update sql and return result count. */
    @Override
    public int executeUpdate(String sql) throws SQLException;
    /* Cancel the running job in sql gateway. */
    @Override
    public void close() throws SQLException;
    /* Return true if the result set has more results. */
    @Override
    public boolean getMoreResults() throws SQLException;
    /* Get current result set in the statement. */
    @Override
    public ResultSet getResultSet() throws SQLException;
}
- Methods in FlinkResultSet: FlinkResultSet only supports fetching data from the iterator StatementResult; it supports getXXX methods and doesn't support deleting, updating or moving the cursor. Compared with ResultSet, there is a getKind method in FlinkResultSet to get the RowKind of the current record.
Code Block
/* ResultSet for flink jdbc driver. Only Batch Mode queries are supported. If you force to submit streaming queries, you may get unrecognized updates, deletions and other results. */
public class FlinkResultSet implements ResultSet {
    /* Return true if there are more results in the result iterator. */
    @Override
    public boolean next() throws SQLException;
    /* Close the fetch result operation. */
    @Override
    public void close() throws SQLException;
    /* Get different values according to data type and column index. */
    @Override
    public <V> V getXXX(int columnIndex) throws SQLException;
    /* Get different values according to data type and column name. */
    @Override
    public <V> V getXXX(String columnName) throws SQLException;
}
- Methods in FlinkDatabaseMetaData: FlinkDatabaseMetaData only supports TABLE and VIEW tables, getting information of catalogs, databases and tables.
Code Block
/* DatabaseMetaData in flink sql driver. */
public class FlinkDatabaseMetaData implements DatabaseMetaData {
    /* Get the url of flink sql driver. */
    @Override
    public String getURL() throws SQLException;
    /* Get catalog name list from session. */
    @Override
    public ResultSet getCatalogs() throws SQLException;
    /* Get database name list from session. */
    @Override
    public ResultSet getSchemas() throws SQLException;
    /* Get database name list in given catalog from session. */
    @Override
    public ResultSet getSchemas(String catalog, String schemaPattern) throws SQLException;
    /* Get table name list with given condition from session. */
    @Override
    public ResultSet getTables(String catalog, String schemaPattern, String tableNamePattern, String[] types) throws SQLException;
    /* Get column list with given condition from session. */
    @Override
    public ResultSet getColumns(String catalog, String schemaPattern, String tableNamePattern, String columnNamePattern) throws SQLException;
    /* Get primary key list for given table from session. */
    @Override
    public ResultSet getPrimaryKeys(String catalog, String schema, String table) throws SQLException;
}
- Methods in FlinkResultSetMetaData: FlinkResultSetMetaData only supports getting column information according to column index or name.
Code Block
/* ResultSetMetaData in flink sql driver. */
public class FlinkResultSetMetaData implements ResultSetMetaData {
    /* Get column count in the result set. */
    @Override
    public int getColumnCount() throws SQLException;
    /* If the column may be null. */
    @Override
    public int isNullable(int column) throws SQLException;
    /* Get display size for the column. */
    @Override
    public int getColumnDisplaySize(int column) throws SQLException;
    /* Get column name according to column index. */
    @Override
    public String getColumnLabel(int column) throws SQLException;
    @Override
    public String getColumnName(int column) throws SQLException;
    /* Get precision for the column index. */
    @Override
    public int getPrecision(int column) throws SQLException;
    /* Get column type id for the column index. */
    @Override
    public int getColumnType(int column) throws SQLException;
    /* Get column type name for the column index. */
    @Override
    public String getColumnTypeName(int column) throws SQLException;
    /* Get column type class name for the column index. */
    @Override
    public String getColumnClassName(int column) throws SQLException;
}
Unsupported Features
- Transactions such as commit and rollback are not supported
- Prepared statements, prepared calls and similar operations are not supported
- Management operations such as savepoints are not supported
Exception Handling
When an error occurs, Flink Jdbc Driver mainly throws the following exceptions:
SQLState Class | SQLState SubClass | Reason | Exception | Operations |
---|---|---|---|---|
22 | 000 to 02H according to different errors | Data conversion error | SQLDataException | Getting data from ResultSet fails in getXXX methods |
0A | 000 | Specific feature is not supported | SQLFeatureNotSupportedException | All unimplemented methods throw this exception |
58 | 004 | The exception or error message from Gateway | SQLNonTransientException | Gateway throws an exception or returns an error message when executing the query |
08 | 006 | The session does not exist in Gateway and the client needs to create a new connection | SQLNonTransientConnectionException | Gateway is restarted and the client needs to create a new connection |
We can continue to subdivide and throw different exceptions according to the error information returned by the Gateway in the future.
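The table above could be realized with a small helper like the following sketch. The helper name and the error-kind strings are hypothetical; only the exception classes and SQLState values come from the table (the generic "22000" stands in for the 000-to-02H data-error subclasses):

```java
import java.sql.SQLDataException;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.sql.SQLNonTransientConnectionException;
import java.sql.SQLNonTransientException;

// Hypothetical factory mapping driver error categories to the JDBC
// exception classes and SQLState values listed in the table above.
class DriverExceptions {
    static SQLException forError(String kind, String message) {
        switch (kind) {
            case "data":        // data conversion error in getXXX
                return new SQLDataException(message, "22000");
            case "unsupported": // unimplemented method
                return new SQLFeatureNotSupportedException(message, "0A000");
            case "gateway":     // error reported by the gateway
                return new SQLNonTransientException(message, "58004");
            case "session":     // session lost, reconnect needed
                return new SQLNonTransientConnectionException(message, "08006");
            default:
                return new SQLException(message);
        }
    }
}
```

Clients can then branch on getSQLState() or on the exception subclass, e.g. reconnecting when they see state 08006.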