Discussion thread: https://lists.apache.org/thread/d1owrg8zh77v0xygcpb93fxt0jpjdkb3
Vote thread: https://lists.apache.org/thread/7jbmg22lnww31sbfdzztwrzgm6bkhjrj
JIRA


Release: 1.18.0


Please keep the discussion on the mailing list rather than commenting on the wiki (wiki discussions get unwieldy fast).

Motivation

The old repositories flink-sql-gateway [1] and flink-jdbc-driver [2] support submitting queries to older versions of Flink clusters, but they have many compatibility problems with the latest Flink clusters. Flink Sql Gateway has become a submodule of Flink, and its API has been greatly optimized. The old Flink JDBC Driver cannot connect to the new Gateway directly, which prevents users from upgrading their Flink clusters. In this FLIP, we'd like to introduce a Flink Jdbc Driver module in Flink that connects to the new Gateway, so that users can use the JDBC driver in their applications to submit queries and fetch results just like with a database.

Proposal

Architecture

Currently, users can use Flink Sql Client to connect to Sql Gateway as follows.

SqlGateway starts a Rest Service, and SqlClient can connect to it. The Executor creates a RestClient according to the address of the Rest Service; it can then create a session, submit SQL statements and fetch results through the RestClient.

We propose that Flink Jdbc Driver interact with SqlGateway through the Executor as well.

There are 7 main classes in Flink Jdbc Driver: FlinkDriver, FlinkDataSource, FlinkConnection, FlinkStatement, FlinkResultSet, FlinkDatabaseMetaData and FlinkResultSetMetaData, which implement the JDBC interfaces Driver, DataSource, Connection, Statement, ResultSet, DatabaseMetaData and ResultSetMetaData respectively.

  1. FlinkDriver parses the gateway address from the URL and creates a FlinkConnection.
  2. FlinkDataSource manages a connection pool for Flink Jdbc Driver; it creates a specific number of connections and picks one for the client directly.
  3. FlinkConnection creates an Executor according to the gateway address. When the connection is closed, it closes the connection with the gateway through the Executor.
  4. FlinkStatement gets the Executor from FlinkConnection and submits SQL queries to it. After a query is executed, FlinkStatement gets a StatementResult from the Executor and creates a FlinkResultSet.
  5. FlinkResultSet is an iterator; it reads results from the StatementResult and returns them to users.
  6. FlinkDatabaseMetaData provides metadata of catalogs, databases and tables.
  7. FlinkResultSetMetaData provides metadata of a ResultSet, such as its columns.

The call relationship between them is as follows.

There are Sessions and Operations in SqlGateway. SqlGateway opens a Session for each FlinkConnection and then performs multiple Operations in that Session. When users create a FlinkConnection through FlinkDriver, it opens an existing or a new Session in SqlGateway. Whenever users want to issue SQL statements, they obtain a FlinkStatement instance from the FlinkConnection. With a FlinkStatement, they can issue a query, which returns a FlinkResultSet instance containing the entire result. Each operation, such as executing a query (a Flink job) or fetching results through FlinkResultSet, is an Operation in the Session of SqlGateway.
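
The minimal sketch below maps standard JDBC calls to these gateway concepts; the gateway address and query are placeholders, and the one-connection-per-session mapping is the behavior proposed in this FLIP.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Opening a connection opens (or reuses) a Session in SqlGateway.
try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}")) {
    try (Statement statement = connection.createStatement();
         // Executing the query runs a Flink job: one Operation in the Session.
         ResultSet resultSet = statement.executeQuery("SELECT * FROM {Your Table}")) {
        while (resultSet.next()) {
            // Fetching results is also performed as an Operation in the same Session.
        }
    }
} // Closing the connection closes the Session in SqlGateway.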

Use Case

The Flink Jdbc Driver module will be packaged into an independent jar file such as flink-table-jdbc-driver-{version}.jar, which contains the classes of Flink Jdbc Driver and shaded Flink classes such as data types. Users only need to add the JDBC dependency in pom.xml, or add the jar to the classpath of an external JDBC tool such as sqlite.

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-jdbc-driver</artifactId>
    <version>${flink.version}</version>
</dependency>

More information about Flink Jdbc Driver

  1. Driver Name: org.apache.flink.table.jdbc.FlinkDriver
  2. Flink Connection URL: Users can use the default catalog and database directly, or set them in the URL. Users can also set custom parameters in the URL to open the session in SqlGateway, in the form key1=val1&key2=val2&...
    1. jdbc:flink://{sql-gateway.host}:{sql-gateway.port}?key1=val1&key2=val2
    2. jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}?key1=val1&key2=val2
    3. jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}/{database name}?key1=val1&key2=val2
  3. Currently SqlGateway does not support authentication, so the user and password in the connection are ignored.
  4. Use Flink Jdbc Driver in Java code

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    String driverName = "org.apache.flink.table.jdbc.FlinkDriver";
    String url = "jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}/{database name}?key1=val1&key2=val2";
    Class.forName(driverName);

    try (Connection connection = DriverManager.getConnection(url)) {
        try (Statement statement = connection.createStatement()) {
            try (ResultSet resultSet = statement.executeQuery("SELECT * FROM {Your Table}")) {
                while (resultSet.next()) {
                    // Do your work ...
                }
            }

            try (ResultSet resultSet = statement.executeQuery("SELECT * FROM T1 JOIN T2 ON T1.id = T2.id ...")) {
                while (resultSet.next()) {
                    // Do your work ...
                }
            }
        }
    }
  5. Users can also add the flink-table-jdbc-driver-{version}.jar to the classpath of external JDBC tools.

Data Types

The following basic Flink data types can be converted to SQL data types in this FLIP, and more types can be supported as needed in the future.

Flink Data Type          | Java Sql Data Type            | Java Data Type
-------------------------|-------------------------------|----------------------
CharType/VarCharType     | CHAR/VARCHAR                  | String
BooleanType              | BOOLEAN                       | Boolean
TinyIntType              | TINYINT                       | Byte
SmallIntType             | SMALLINT                      | Short
IntType                  | INTEGER                       | Int
BigIntType               | BIGINT                        | Long
FloatType                | FLOAT                         | Float
DoubleType               | DOUBLE                        | Double
DecimalType              | DECIMAL                       | BigDecimal
BinaryType/VarBinaryType | BINARY/VARBINARY              | byte[]
DateType                 | DATE                          | Date
TimeType                 | TIME                          | Time
TimestampType            | TIMESTAMP                     | Timestamp
ZonedTimestampType       | TIMESTAMP_WITH_TIMEZONE       | OffsetDateTime
LocalZonedTimestampType  | TIMESTAMP_WITH_LOCAL_TIMEZONE | Timestamp
ArrayType                | ARRAY                         | Array
RowType                  | ROW (not in java.sql.Types)   | Row (Flink row data)
MapType                  | MAP                           | Map<K, V>

Currently TIMESTAMP_WITH_LOCAL_TIMEZONE does not exist in java.sql.Types, but it is supported by Flink. Users can define a field as (f TIMESTAMP(p) WITH LOCAL TIME ZONE) and set the time zone through the connection parameters, or through dynamic parameters in the console via table.local-time-zone. After that, users get a Timestamp that is automatically converted from the stored time data according to the given time zone.
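
As a sketch of reading such a column (the table name, and the assumption that table.local-time-zone can be passed directly as a URL parameter like the other session options above, are placeholders/assumptions of this FLIP):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.Timestamp;

// f is assumed to be declared as TIMESTAMP(p) WITH LOCAL TIME ZONE in {Your Table}.
String url = "jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}/{database name}"
        + "?table.local-time-zone=Asia/Shanghai";

try (Connection connection = DriverManager.getConnection(url);
     Statement statement = connection.createStatement();
     ResultSet resultSet = statement.executeQuery("SELECT f FROM {Your Table}")) {
    while (resultSet.next()) {
        // The stored instant is converted to a Timestamp in the configured time zone.
        Timestamp ts = resultSet.getTimestamp("f");
    }
}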

Java Sql Interfaces

There are many methods in the JDBC interfaces; this FLIP only implements the basic methods first, and more methods will be implemented later when they are needed.

  • Methods in FlinkDriver 
/* Jdbc Driver for flink sql gateway. Only Batch Mode queries are supported. If streaming queries are submitted anyway, you may get unrecognized updates, deletions and other results in FlinkResultSet. */
public class FlinkDriver implements Driver {
    /* Connect to sql gateway with the given url and open/create a session with the given properties. */
    @Override
    public Connection connect(String url, Properties info) throws SQLException;
}
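
A minimal usage sketch for FlinkDriver, passing session properties through DriverManager (whether a particular key is accepted as a session option is an assumption; custom parameters can equally be passed in the URL as shown in the Use Case section):

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

Properties info = new Properties();
info.setProperty("key1", "val1"); // hypothetical session option

try (Connection connection = DriverManager.getConnection(
        "jdbc:flink://{sql-gateway.host}:{sql-gateway.port}", info)) {
    // The driver opens or creates a session in SqlGateway with the given properties.
}
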
  • Methods in FlinkDataSource
/* Jdbc DataSource manages connections for client, we can support more operations in it in the future. */
public class FlinkDataSource implements DataSource {
    /* The max count of connections which the data source holds. */
    private int maxActive;

    /* Set the url of connection. */
    public synchronized void setUrl(String url);

    /* Set the driver class name for the source. */
    public synchronized void setDriverClassName(String driverClassName);

    /* Set the max active connection for the source. */
    public synchronized void setMaxActive(int maxActive);

    /* Get a connection from data source. */
    @Override
    public Connection getConnection() throws SQLException;
    @Override
    public Connection getConnection(String username, String password) throws SQLException;
}
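
A usage sketch of FlinkDataSource based on the setters listed above (FlinkDataSource is the class proposed in this FLIP; the package name and pooling behavior follow the earlier description and are not an existing API):

import java.sql.Connection;
import org.apache.flink.table.jdbc.FlinkDataSource; // proposed in this FLIP

FlinkDataSource dataSource = new FlinkDataSource();
dataSource.setDriverClassName("org.apache.flink.table.jdbc.FlinkDriver");
dataSource.setUrl("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}/{catalog name}");
dataSource.setMaxActive(10); // hold at most 10 connections (sessions in SqlGateway)

try (Connection connection = dataSource.getConnection()) {
    // Use the pooled connection like any other JDBC connection.
}
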
  • Methods in FlinkConnection 
/* Connection to flink sql gateway for jdbc driver. */
public class FlinkConnection implements Connection {
    /* Create statement from connection. */
    @Override
    public Statement createStatement() throws SQLException;

    /* Close the session in sql gateway. */
    @Override
    public void close() throws SQLException;

    /* Set the given catalog for the session in sql gateway. */
    @Override
    public void setCatalog(String catalog) throws SQLException;

    /* Get current catalog name from session. */
    @Override
    public String getCatalog() throws SQLException;

    /* Get FlinkDatabaseMetaData instance for the current catalog. */
    @Override
    public DatabaseMetaData getMetaData() throws SQLException;

    /* Set the given database for the session in sql gateway. */
    @Override
    public void setSchema(String schema) throws SQLException;

    /* Get current database name from session. */
    @Override
    public String getSchema() throws SQLException;
}
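
For illustration, switching the current catalog and database of the underlying session (catalog and database names are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}")) {
    // Both calls update the session in SqlGateway, like USE CATALOG / USE statements.
    connection.setCatalog("{catalog name}");
    connection.setSchema("{database name}");

    System.out.println(connection.getCatalog()); // {catalog name}
    System.out.println(connection.getSchema());  // {database name}
}
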
  • Methods in FlinkStatement 
/* Statement in flink jdbc driver. */
public class FlinkStatement implements Statement {
    /* Submit sql to sql gateway and get result set. */
    @Override
    public ResultSet executeQuery(String sql) throws SQLException;

    /* Execute given update sql and return result count. */
    @Override
    public int executeUpdate(String sql) throws SQLException;

    /* Cancel the running job in sql gateway. */
    @Override
    public void close() throws SQLException;

    /* Return true if the result set has more results. */
    @Override
    public boolean getMoreResults() throws SQLException;

    /* Get current result set in the statement. */
    @Override
    public ResultSet getResultSet() throws SQLException;
}
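
Complementing the query example in the Use Case section, a short sketch of executeUpdate (the INSERT statement is a placeholder; the returned count is whatever the gateway reports for the operation):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}");
     Statement statement = connection.createStatement()) {
    // Submit an update statement to the gateway and read back the affected row count.
    int count = statement.executeUpdate("INSERT INTO {Your Table} VALUES (...)");
    System.out.println("Affected rows: " + count);
} // Closing the statement cancels the running job in the gateway, if any.
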
  • Methods in FlinkResultSet: FlinkResultSet only supports fetching data from the iterator StatementResult; it supports the getXXX methods and doesn't support deleting, updating or moving the cursor. Compared with ResultSet, FlinkResultSet has an additional getKind method to get the RowKind of the current record.
/* ResultSet for flink jdbc driver. Only Batch Mode queries are supported. If streaming queries are submitted anyway, you may get unrecognized updates, deletions and other results. */
public class FlinkResultSet implements ResultSet {
    /* Return true if there are more results in the result iterator. */
    @Override
    public boolean next() throws SQLException;

    /* Close the fetch result operation. */
    @Override
    public void close() throws SQLException;

    /* Get different values according to data type and column index; getXXX stands for the typed getters such as getInt and getString. */
    @Override
    public <V> V getXXX(int columnIndex) throws SQLException;

    /* Get different values according to data type and column name. */
    @Override
    public <V> V getXXX(String columnName) throws SQLException;
}
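
For example, reading columns by index and by name (column names and types are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}");
     Statement statement = connection.createStatement();
     ResultSet resultSet = statement.executeQuery("SELECT id, name FROM {Your Table}")) {
    while (resultSet.next()) {
        long id = resultSet.getLong(1);            // access by 1-based column index
        String name = resultSet.getString("name"); // access by column name
    }
}
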
  • Methods in FlinkDatabaseMetaData: FlinkDatabaseMetaData only supports the TABLE and VIEW table types, and getting information about catalogs, databases and tables.
/* DatabaseMetaData in flink sql driver. */
public class FlinkDatabaseMetaData implements DatabaseMetaData {
    /* Get the url of flink sql driver. */
    @Override
    public String getURL() throws SQLException;

    /* Get catalog name list from session. */
    @Override
    public ResultSet getCatalogs() throws SQLException;

    /* Get database name list from session. */
    @Override
    public ResultSet getSchemas() throws SQLException;

    /* Get database name list in given catalog from session. */
    @Override
    public ResultSet getSchemas(String catalog, String schemaPattern) throws SQLException;

    /* Get table name list with given condition from session. */
    @Override
    public ResultSet getTables(String catalog, String schemaPattern, String tableNamePattern, String[] types) throws SQLException;

    /* Get column list with given condition from session. */
    @Override
    public ResultSet getColumns(String catalog, String schemaPattern, String tableNamePattern, String columnNamePattern) throws SQLException;

    /* Get primary key list for given table from session. */
    @Override
    public ResultSet getPrimaryKeys(String catalog, String schema, String table) throws SQLException;
}
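
A sketch of listing catalogs and tables through this metadata; the result column names TABLE_CAT and TABLE_NAME follow the standard java.sql.DatabaseMetaData contract and are assumed to be populated accordingly:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}")) {
    DatabaseMetaData metaData = connection.getMetaData();

    // List all catalogs registered in the gateway session.
    try (ResultSet catalogs = metaData.getCatalogs()) {
        while (catalogs.next()) {
            System.out.println(catalogs.getString("TABLE_CAT"));
        }
    }

    // List TABLE and VIEW objects in a given catalog/database (placeholders).
    try (ResultSet tables = metaData.getTables(
            "{catalog name}", "{database name}", "%", new String[] {"TABLE", "VIEW"})) {
        while (tables.next()) {
            System.out.println(tables.getString("TABLE_NAME"));
        }
    }
}
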
  • Methods in FlinkResultSetMetaData: FlinkResultSetMetaData only supports getting column information according to column index or name.
/* ResultSetMetaData in flink sql driver. */
public class FlinkResultSetMetaData implements ResultSetMetaData {
    /* Get column count in the result set. */
    @Override
    public int getColumnCount() throws SQLException;

    /* Whether the column may be null. */
    @Override
    public int isNullable(int column) throws SQLException;

    /* Get display size for the column. */
    @Override
    public int getColumnDisplaySize(int column) throws SQLException;

    /* Get column label/name according to column index. */
    @Override
    public String getColumnLabel(int column) throws SQLException;
    @Override
    public String getColumnName(int column) throws SQLException;

    /* Get precision for the column index. */
    @Override
    public int getPrecision(int column) throws SQLException;

    /* Get column type id for the column index. */
    @Override
    public int getColumnType(int column) throws SQLException;

    /* Get column type name for the column index. */
    @Override
    public String getColumnTypeName(int column) throws SQLException;

    /* Get column type class name for the column index. */
    @Override
    public String getColumnClassName(int column) throws SQLException;
}
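
For illustration, inspecting the schema of a result set with these methods (the query is a placeholder):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}");
     Statement statement = connection.createStatement();
     ResultSet resultSet = statement.executeQuery("SELECT * FROM {Your Table}")) {
    ResultSetMetaData metaData = resultSet.getMetaData();
    for (int i = 1; i <= metaData.getColumnCount(); i++) {
        System.out.printf(
                "%s %s (java.sql.Types id %d)%n",
                metaData.getColumnName(i),
                metaData.getColumnTypeName(i),
                metaData.getColumnType(i));
    }
}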

Unsupported Features

  1. Transactions (commit, rollback, etc.) are not supported.
  2. Prepared statements, prepared calls and similar operations are not supported.
  3. Management operations such as savepoints are not supported.

Exception Handling

When an error occurs, Flink Jdbc Driver mainly throws the following exceptions:

SQLState Class | SQLState SubClass                 | Reason                                                                           | Exception                          | Operations
---------------|-----------------------------------|----------------------------------------------------------------------------------|------------------------------------|--------------------------------------------------------------------------------------
22             | 000 to 02H according to the error | Description of the data conversion error                                         | SQLDataException                   | Getting data from the ResultSet fails in a getXXX method
0A             | 000                               | The specific feature is not supported                                             | SQLFeatureNotSupportedException    | All unimplemented methods throw this exception
58             | 004                               | The exception or error message from the Gateway                                   | SQLNonTransientException           | The Gateway throws an exception or returns an error message when executing the query
08             | 006                               | The session does not exist in the Gateway and the client needs a new connection   | SQLNonTransientConnectionException | The Gateway was restarted and the client needs to create a new connection

We can continue to subdivide these cases and throw different exceptions according to the error information returned by the Gateway in future versions of Flink Jdbc Driver.
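
A hedged sketch of how a client could react to these exception types (the query is a placeholder, and the retry or reconnect policy is left to the application):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLDataException;
import java.sql.SQLFeatureNotSupportedException;
import java.sql.SQLNonTransientConnectionException;
import java.sql.SQLNonTransientException;
import java.sql.Statement;

try (Connection connection =
        DriverManager.getConnection("jdbc:flink://{sql-gateway.host}:{sql-gateway.port}");
     Statement statement = connection.createStatement();
     ResultSet resultSet = statement.executeQuery("SELECT * FROM {Your Table}")) {
    while (resultSet.next()) {
        resultSet.getInt(1); // may throw SQLDataException (class 22) on conversion errors
    }
} catch (SQLNonTransientConnectionException e) {
    // 08006: the session no longer exists (e.g. the gateway restarted); create a new connection.
} catch (SQLFeatureNotSupportedException e) {
    // 0A000: the invoked method or feature is not implemented by the driver.
} catch (SQLDataException e) {
    // 22xxx: a value could not be converted by a getXXX method.
} catch (SQLNonTransientException e) {
    // 58004: the gateway reported an error while executing the query.
}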



[1] https://github.com/ververica/flink-sql-gateway

[2] https://github.com/ververica/flink-jdbc-driver