...

  1. We propose to deprecate the following methods:

    • TableEnvironment.sqlUpdate(String)
    • TableEnvironment.insertInto(String, Table)
    • TableEnvironment.execute(String)
    • TableEnvironment.explain(boolean)
    • TableEnvironment.fromTableSource(TableSource<?>)
    • Table.insertInto(String)

  2. Meanwhile, we propose to introduce the following new methods (a usage sketch follows the list below):

    Code Block
    languagejava
    titleNew methods in TableEnvironment
    interface TableEnvironment {
        // execute the given single statement, and return the execution result.
        TableResult executeSql(String statement);

        // get the AST and the execution plan for the given single statement (DQL, DML)
        String explainSql(String statement, ExplainDetail... extraDetails);

        // create a StatementSet instance which can add DML statements or Tables
        // to the set and explain or execute them as a batch.
        StatementSet createStatementSet();
    }


    Code Block
    languagejava
    titleNew methods in Table
    interface Table {
        // write the Table to a TableSink that was registered
        // under the specified path.
        TableResult executeInsert(String tablePath);

        // write the Table to a TableSink that was registered under the
        // specified path, overwriting existing data if overwrite is true.
        TableResult executeInsert(String tablePath, boolean overwrite);

        // get the AST and the execution plan to compute the result of
        // the current Table.
        String explain(ExplainDetail... extraDetails);

        // get the contents of the current Table.
        TableResult execute();
    }


    Code Block
    languagejava
    titleNew class: TableResult
    interface TableResult {
        // return the JobClient if a Flink job was submitted
        // (e.g. for a DML/DQL statement), else return empty (e.g. for DDL).
        Optional<JobClient> getJobClient();

        // return the schema of the result
        TableSchema getTableSchema();

        // return the ResultKind, which avoids custom parsing of an "OK" row
        // in programs
        ResultKind getResultKind();

        // get the result contents as an iterator of rows
        Iterator<Row> collect();

        // print the result contents
        void print();
    }


    Code Block
    languagejava
    titleNew class: ResultKind
    public enum ResultKind {
        // for DDL, DCL and statements with a simple "OK"
        SUCCESS,

        // rows with important content are available (DML, DQL)
        SUCCESS_WITH_CONTENT
    }


    Code Block
    languagejava
    titleNew class: StatementSet
    interface StatementSet {
        // add a single INSERT statement to the set
        StatementSet addInsertSql(String statement);

        // add a Table with the given sink table name to the set
        StatementSet addInsert(String targetPath, Table table);

        // add a Table with the given sink table name to the set,
        // overwriting existing data if overwrite is true
        StatementSet addInsert(String targetPath, Table table, boolean overwrite);

        // get the AST and the execution plan to compute the result of
        // all statements and Tables in the set
        String explain(ExplainDetail... extraDetails);

        // execute all statements and Tables in the set as one batch
        TableResult execute();
    }


    Code Block
    languagejava
    titleNew class: ExplainDetail
    public enum ExplainDetail {
        STATE_SIZE_ESTIMATE,
        UID,
        HINTS,
        ...
    }


  3. To clean up the currently confusing trigger points of a Flink table program, we propose the following rule: for TableEnvironment and StreamTableEnvironment, you must use `TableEnvironment.execute()` to trigger execution of the table program; once you have converted the table program into a DataStream program (through the `toAppendStream` or `toRetractStream` method), you must use `StreamExecutionEnvironment.execute()` to trigger the DataStream program (see the trigger-point sketch following this list).
    A similar rule applies to BatchTableEnvironment: you must use `TableEnvironment.execute()` to trigger execution of the batch table program, and once you have converted the table program into a DataSet program (through the `toDataSet` method), you must use `ExecutionEnvironment.execute()` to trigger the DataSet program.
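
To make the intended use of the new methods in point 2 concrete, here is a minimal usage sketch. It is illustrative only: the table names (`Orders`, `SinkA`, `SinkB`), the `datagen`/`print` connector options, the surrounding `main` method and the package names (which assume the interfaces land in `org.apache.flink.table.api`) are assumptions, not part of the proposal; only the calls to `executeSql`, `createStatementSet`, `addInsertSql`, `addInsert`, `explain` and `execute` come from the interfaces above.

Code Block
languagejava
titleUsage sketch (illustrative): executeSql + StatementSet
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.StatementSet;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;

public class StatementSetSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // DDL: executed immediately, no Flink job is submitted
        tEnv.executeSql("CREATE TABLE Orders (id BIGINT, amount DOUBLE) "
                + "WITH ('connector' = 'datagen', 'number-of-rows' = '10')");
        tEnv.executeSql("CREATE TABLE SinkA (id BIGINT, amount DOUBLE) WITH ('connector' = 'print')");
        tEnv.executeSql("CREATE TABLE SinkB (id BIGINT, doubled DOUBLE) WITH ('connector' = 'print')");

        // buffer several INSERT statements / Tables and run them as one job
        StatementSet stmtSet = tEnv.createStatementSet();
        stmtSet.addInsertSql("INSERT INTO SinkA SELECT id, amount FROM Orders");
        stmtSet.addInsert("SinkB", tEnv.sqlQuery("SELECT id, amount * 2 FROM Orders"));

        // AST and execution plan of the whole set, then a single job submission
        System.out.println(stmtSet.explain());
        TableResult result = stmtSet.execute();
        result.getJobClient().ifPresent(client -> System.out.println(client.getJobID()));
    }
}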
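
And a minimal sketch of the trigger rule in point 3 for the streaming case, again with illustrative table names and connector options (class and package names follow Flink 1.11): once the Table has been converted to a DataStream, only `StreamExecutionEnvironment.execute()` submits the job.

Code Block
languagejava
titleTrigger point sketch (illustrative): converted DataStream program
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class TriggerPointSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // assumed source table; the DDL is illustrative
        tEnv.executeSql("CREATE TABLE Orders (id BIGINT, amount DOUBLE) "
                + "WITH ('connector' = 'datagen', 'number-of-rows' = '10')");
        Table filtered = tEnv.sqlQuery("SELECT id, amount FROM Orders WHERE amount > 0");

        // after this conversion the program is a DataStream program ...
        DataStream<Row> stream = tEnv.toAppendStream(filtered, Row.class);
        stream.print();

        // ... so it must be triggered through StreamExecutionEnvironment.execute()
        env.execute("converted DataStream program");
    }
}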

...

Code Block
languagejava
titleNew method in TableEnvironment
interface TableEnvironment {
    /**
     * Execute the given single statement; the statement can be
     * DDL/DML/SHOW/DESCRIBE/EXPLAIN/USE.
     *
     * If the statement is translated to a Flink job (e.g. DML/DQL),
     * the TableResult is returned once the job has been submitted, and
     * it contains a JobClient instance associated with the job.
     * Otherwise, the TableResult is returned once statement execution
     * has finished, and it does not contain a JobClient instance.
     *
     * @return the result for SHOW/DESCRIBE/EXPLAIN statements, the affected
     * row count for DML (-1 means unknown), or a string message ("OK") for
     * other statements.
     */
    TableResult executeSql(String statement);
}

...

This method only supports executing a single statement, which can be DDL, DML, SHOW, DESCRIBE, EXPLAIN or USE. For DML and DQL, this method returns the TableResult once the job has been submitted. For DDL and DCL statements, the TableResult is returned once the operation has finished. TableResult is the representation of the execution result and contains the result data and the result schema. If the statement is DML, the TableResult contains a JobClient associated with the submitted job.
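
As a concrete illustration of this behavior, the sketch below executes a DDL statement (no job submitted, no JobClient), a SHOW statement, and a DML statement (job submitted, JobClient present). The table names, `datagen`/`print` connector options and the surrounding `main` method are assumptions made for the example, not part of the proposal.

Code Block
languagejava
titleexecuteSql sketch (illustrative)
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;

public class ExecuteSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // DDL: returns once the catalog operation has finished; no JobClient
        TableResult ddlResult = tEnv.executeSql("CREATE TABLE Orders (id BIGINT, amount DOUBLE) "
                + "WITH ('connector' = 'datagen', 'number-of-rows' = '5')");
        System.out.println(ddlResult.getResultKind());            // SUCCESS
        System.out.println(ddlResult.getJobClient().isPresent()); // false

        // SHOW: the result rows describe the catalog contents
        tEnv.executeSql("SHOW TABLES").print();

        // DML: returns once the job has been submitted; JobClient is present
        tEnv.executeSql("CREATE TABLE PrintSink (id BIGINT, amount DOUBLE) WITH ('connector' = 'print')");
        TableResult dmlResult = tEnv.executeSql("INSERT INTO PrintSink SELECT id, amount FROM Orders");
        System.out.println(dmlResult.getResultKind());            // SUCCESS_WITH_CONTENT
        dmlResult.getJobClient().ifPresent(client -> System.out.println(client.getJobID()));
    }
}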

...