...
We need to allow users to specify the SQL dialect in the sql-client yaml file. Based on that dialect, `PlannerContext::getSqlParserConfig` will choose which parser to use. To do so, we'll implement a `SqlParserImplFactory` that creates a parser according to the dialect/conformance in use. Suppose the new parser for Hive is named `FlinkHiveSqlParserImpl`, and the `SqlParserImplFactory` implementation is named `FlinkSqlParserImplFactory`; it will be something like this:
```java
public class FlinkSqlParserImplFactory implements SqlParserImplFactory {

	private final SqlConformance conformance;

	public FlinkSqlParserImplFactory(SqlConformance conformance) {
		this.conformance = conformance;
	}

	@Override
	public SqlAbstractParserImpl getParser(Reader stream) {
		// Dispatch to the Hive parser when the Hive dialect is in use,
		// otherwise fall back to the default Flink parser.
		if (conformance == FlinkSqlConformance.HIVE) {
			return FlinkHiveSqlParserImpl.FACTORY.getParser(stream);
		} else {
			return FlinkSqlParserImpl.FACTORY.getParser(stream);
		}
	}
}
```
In `PlannerContext::getSqlParserConfig`, we'll use `FlinkSqlParserImplFactory` to create the config:
```java
private SqlParser.Config getSqlParserConfig() {
	return JavaScalaConversionUtil.<SqlParser.Config>toJava(
			getCalciteConfig(tableConfig).getSqlParserConfig()).orElseGet(
			// we use Java lex because back ticks are easier than double quotes in programming
			// and cases are preserved
			() -> {
				SqlConformance conformance = getSqlConformance();
				return SqlParser
						.configBuilder()
						.setParserFactory(new FlinkSqlParserImplFactory(conformance))
						.setConformance(conformance)
						.setLex(Lex.JAVA)
						.setIdentifierMaxLength(256)
						.build();
			}
	);
}
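On the user-facing side, the dialect would be picked up from the sql-client yaml file mentioned above. As a sketch only (the exact option name and yaml section are assumptions, not part of this proposal's text), the configuration might look like:

```yaml
# sql-client yaml file (hypothetical example)
configuration:
  # switch the parser to the Hive dialect; "default" would select the regular Flink parser
  table.sql-dialect: hive
```

With such a setting in place, `getSqlConformance()` would return `FlinkSqlConformance.HIVE`, and `FlinkSqlParserImplFactory` would hand out `FlinkHiveSqlParserImpl` for all statements parsed in that session.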
New or Changed Public Interfaces
...