...

...


Discussion thread

http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/DISCUSS-FLIP-123-DDL-and-DML-compatibility-for-Hive-connector-td39633.html

...


Vote thread

JIRA: FLINK-17198

Release: 1.11


Please keep the discussion on the mailing list rather than commenting on the wiki (wiki discussions get unwieldy fast).

...

We need to allow users to specify the dialect in the sql-client yaml file. In PlannerContext::getSqlParserConfig, we'll choose which parser to use according to the dialect/conformance in use. To do that, we'll implement a SqlParserImplFactory which creates the parser for the current dialect. Suppose the new parser for Hive is named FlinkHiveSqlParserImpl, and the SqlParserImplFactory implementation is named FlinkSqlParserImplFactory; the factory will be something like this:

Code Block
languagejava
public class FlinkSqlParserImplFactory implements SqlParserImplFactory {

	private final SqlConformance conformance;

	public FlinkSqlParserImplFactory(SqlConformance conformance) {
		this.conformance = conformance;
	}

	@Override
	public SqlAbstractParserImpl getParser(Reader stream) {
		if (conformance == FlinkSqlConformance.HIVE) {
			return FlinkHiveSqlParserImpl.FACTORY.getParser(stream);
		} else {
			return FlinkSqlParserImpl.FACTORY.getParser(stream);
		}
	}
}


In PlannerContext::getSqlParserConfig, we’ll use FlinkSqlParserImplFactory to create the config:

Code Block
languagejava
	private SqlParser.Config getSqlParserConfig() {
		return JavaScalaConversionUtil.<SqlParser.Config>toJava(getCalciteConfig(tableConfig).getSqlParserConfig()).orElseGet(
				// we use Java lex because back ticks are easier than double quotes in programming
				// and cases are preserved
				() -> {
					SqlConformance conformance = getSqlConformance();
					return SqlParser
							.configBuilder()
							.setParserFactory(new FlinkSqlParserImplFactory(conformance))
							.setConformance(conformance)
							.setLex(Lex.JAVA)
							.setIdentifierMaxLength(256)
							.build();
				}
		);
	}
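
On the sql-client side, the dialect could then be specified in the YAML config file. A sketch, assuming the existing table.sql-dialect option is the switch (the exact key is an assumption of this sketch, not part of this FLIP's text):

```
# sql-client-defaults.yaml (sketch; assumes the dialect is exposed
# through the table.sql-dialect configuration option)
configuration:
  table.sql-dialect: hive   # or "default" for Flink's own dialect
```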


New or Changed Public Interfaces

...

Database

  Supported:
    • CREATE
    • DROP
    • ALTER
    • USE
    • SHOW
  Not supported:
    • SHOW DATABASES LIKE: show databases filtering by a regular expression. Missing Catalog API.
    • DESCRIBE: we don't have a TableEnvironment API for this. Perhaps it's easier to implement when FLIP-84 is in place.

Table

  Supported:
    • CREATE: support specifying EXTERNAL, PARTITIONED BY, ROW FORMAT, STORED AS, LOCATION and table properties. Data types will also be in HiveQL syntax, e.g. STRUCT.
    • DROP
    • ALTER: include rename, update table properties, update SerDe properties, update fileformat and update location.
    • SHOW
    • DESCRIBE
  Not supported:
    • Bucketed tables
    • CREATE LIKE: wait for FLIP-110.
    • CREATE AS: missing underlying functionalities, e.g. create the table when the job succeeds.
    • Temporary tables: missing underlying functionalities, e.g. removing the files of the temporary table when the session ends.
    • SKEWED BY [STORED AS DIRECTORIES]: currently we don't use the skew info of a Hive table.
    • STORED BY: we don't support Hive tables with a storage handler yet.
    • UNION type
    • TRANSACTIONAL tables
    • DROP PURGE: data will be deleted w/o going to trash. Applies to either a table or partitions. Missing Catalog API.
    • TRUNCATE: remove all rows from a table or partitions. Missing Catalog APIs.
    • TOUCH, PROTECTION, COMPACT, CONCATENATE, UPDATE COLUMNS: applies to either a table or partitions. Too Hive-specific or missing underlying functionalities.
    • SHOW TABLES 'regex': show tables filtering by a regular expression. Missing Catalog API.
    • FOREIGN KEY, UNIQUE, DEFAULT, CHECK: these constraints are currently not used by the Hive connector.

Partition

  Supported:
    • ALTER: include add, drop, update fileformat and update location.
    • SHOW: support specifying a partial spec.
  Not supported:
    • Exchange, Discover, Retention, Recover, (Un)Archive: too Hive-specific or missing underlying functionalities.
    • RENAME: update a partition's spec. Missing Catalog API.
    • DESCRIBE: we don't have a TableEnvironment API for this. Perhaps it's easier to implement when FLIP-84 is in place.
    • ALTER with partial spec: alter multiple partitions matching a partial spec. Missing Catalog API.

Column

  Supported:
    • ALTER: change name, type, position, comment for a single column. Add new columns. Replace all columns.

Function

  Supported:
    • CREATE
    • DROP
    • SHOW
  Not supported:
    • CREATE FUNCTION USING FILE|JAR…: to support this, we need to be able to dynamically add resources to a session.
    • RELOAD: Hive-specific.
    • SHOW FUNCTIONS LIKE: show functions filtering by a regular expression. Missing Catalog API.

View

  Not supported:
    • CREATE: wait for FLIP-71.
    • DROP: wait for FLIP-71.
    • ALTER: wait for FLIP-71.
    • SHOW: wait for FLIP-71.
    • DESCRIBE: wait for FLIP-71.
    • SHOW VIEWS LIKE: show views filtering by a regular expression. Missing Catalog API.
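
As an illustration of the supported Table DDLs above, statements like the following Hive-syntax DDLs would be accepted under the Hive dialect (table, column and property names here are made up for the example):

```sql
-- Hive-syntax CREATE TABLE with EXTERNAL, PARTITIONED BY, ROW FORMAT,
-- STORED AS, LOCATION and table properties; note the HiveQL STRUCT type
CREATE EXTERNAL TABLE page_views (
  user_id BIGINT,
  url STRING,
  info STRUCT<browser:STRING, os:STRING>
)
PARTITIONED BY (dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/warehouse/page_views'
TBLPROPERTIES ('k1'='v1');

-- supported ALTER: update table properties
ALTER TABLE page_views SET TBLPROPERTIES ('k2'='v2');

-- supported SHOW with a partial partition spec
SHOW PARTITIONS page_views PARTITION (dt='2020-05-01');
```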

...