...

  1. Enable users to integrate Flink with the cores and built-in objects of other systems, so they can seamlessly reuse what they are familiar with in other SQL systems as core built-ins of Flink SQL and Table API
  2. Empower users to write code and do customized development for the Flink table core

Modules define a set of metadata, including functions, user defined types, operators, rules, etc. Prebuilt modules will be provided, or users may choose to write their own. Flink will take metadata from modules as extensions of its core built-in system that users can take advantage of. For example, users can define their own geo functions and geo data types and plug them into Flink table as built-in objects. Another example is that users can use an off-the-shelf Hive module to use Hive built-in functions as part of Flink built-in functions.
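
To make the examples below concrete, here is a minimal sketch of what the Module interface could look like; only the function-related methods used by ModuleManager below are shown, and the default implementations are an assumption for illustration.

import java.util.Collections;
import java.util.Optional;
import java.util.Set;
import org.apache.flink.table.functions.FunctionDefinition;

public interface Module {

  // Names of the functions this module provides as built-in functions.
  default Set<String> listFunctions() {
    return Collections.emptySet();
  }

  // Resolves a function definition by name, if this module provides it.
  default Optional<FunctionDefinition> getFunctionDefinition(String name) {
    return Optional.empty();
  }

  // listUserDefinedTypes(), getUserDefinedType(), etc. can be added later
  // for other kinds of module objects.
}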

...

In this FLIP we’ll design and develop a generic mechanism for pluggable modules in the Flink table core, with a focus on built-in functions.

We’ll specifically create two module implementations in this FLIP (a usage sketch follows the list below):

  • CoreModule, with existing Flink built-in functions only
  • HiveModule, supporting Hive built-in functions and numerous Hive versions
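
A usage sketch, assuming TableEnvironment gains the proposed loadModule()/unloadModule() methods and that HiveModule takes a Hive version string (both are assumptions for illustration):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ModuleExample {
  public static void main(String[] args) {
    TableEnvironment tableEnv =
        TableEnvironment.create(EnvironmentSettings.newInstance().build());

    // CoreModule is loaded by default; HiveModule is loaded on demand.
    // The constructor argument is an assumed way to pick the Hive version.
    tableEnv.loadModule("hive", new HiveModule("2.3.4"));

    // Hive built-in functions (e.g. get_json_object) can now be used in
    // SQL queries as if they were Flink built-in functions.

    // Unload the module again when it is no longer needed.
    tableEnv.unloadModule("hive");
  }
}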

...

Users have to be fully aware of the consequences of resetting modules, as some objects may no longer be referenceable or the resolution order of some objects may change. E.g., “CAST” and “AS” cannot be overridden in CoreModule, and users should be aware of that.

How to Load

...

Modules

To load modules, users have to make sure the relevant classes are already in the classpath.

...

interface ModuleFactory extends TableFactory {
   Module createModule(String name, Map<String, String> properties);
}
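
Since ModuleFactory extends TableFactory, module factories can be discovered from the classpath with the existing factory service. A sketch, assuming discovery via TableFactoryService and descriptor-style properties (the "hive-version" key and the module name are assumptions for illustration):

import java.util.HashMap;
import java.util.Map;
import org.apache.flink.table.factories.TableFactoryService;

public class ModuleDiscoveryExample {
  public static void main(String[] args) {
    // Descriptor-style properties; "type" matches a factory's requiredContext().
    Map<String, String> properties = new HashMap<>();
    properties.put("type", "hive");
    properties.put("hive-version", "2.3.4"); // assumed property key

    // Find a matching ModuleFactory on the classpath and create the module.
    ModuleFactory factory = TableFactoryService.find(ModuleFactory.class, properties);
    Module module = factory.createModule("myhive", properties);
  }
}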

...

class CoreModuleFactory implements ModuleFactory {
  @Override
  public Map<String, String> requiredContext() {
    Map<String, String> context = new HashMap<>();
    context.put("type", "core");
    return context;
  }

  @Override
  public List<String> supportedProperties() {
    return Collections.emptyList();
  }

  @Override
  public Module createModule(String name, Map<String, String> properties) {
    return CoreModule.INSTANCE;
  }
}
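
For example, creating the core module through this factory simply hands back the shared singleton (a small sketch):

import java.util.Collections;

public class CoreModuleFactoryExample {
  public static void main(String[] args) {
    ModuleFactory factory = new CoreModuleFactory();

    // The core factory ignores the properties and returns the shared instance.
    Module core = factory.createModule("core", Collections.emptyMap());

    System.out.println(core == CoreModule.INSTANCE); // true
  }
}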

...

ModuleManager is responsible for loading all the modules, managing their life cycles, and resolving module objects.

public class ModuleManager {
  private LinkedHashMap<String, Module> modules;

  public ModuleManager() {
    this.modules = new LinkedHashMap<>();

    modules.put("core", CoreModule.INSTANCE);
  }

  public void loadModule(String name, Module module) { ... }

  public void unloadModule(String name) { ... }

  public Set<String> listFunctions() {
    return modules.values().stream()
        .map(m -> m.listFunctions())
        .flatMap(s -> s.stream())
        .collect(Collectors.toSet());
  }

  public Optional<FunctionDefinition> getFunctionDefinition(String name) {
    Optional<Module> module = modules.values().stream()
        .filter(m -> m.listFunctions().contains(name))
        .findFirst();

    return module.isPresent() ? module.get().getFunctionDefinition(name) : Optional.empty();
  }


  // addUserDefinedTypes(), getUserDefinedTypes(), etc.
}
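
Because getFunctionDefinition() takes the first match over the insertion-ordered modules, a function name defined by several modules resolves to the module loaded earliest. A sketch (HiveModule's constructor and its "concat" function are assumptions for illustration):

import java.util.Optional;
import org.apache.flink.table.functions.FunctionDefinition;

public class ResolutionOrderExample {
  public static void main(String[] args) {
    // CoreModule is loaded first by the ModuleManager constructor, so its
    // "concat" shadows the one provided by the later-loaded Hive module.
    ModuleManager manager = new ModuleManager();
    manager.loadModule("hive", new HiveModule("2.3.4")); // assumed constructor

    // Resolves to CoreModule's CONCAT because CoreModule was loaded first.
    Optional<FunctionDefinition> concat = manager.getFunctionDefinition("concat");
    concat.ifPresent(def -> System.out.println(def));
  }
}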

...

FunctionCatalog will hold ModuleManager to resolve built-in functions.
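
A minimal sketch of that wiring; the method name lookupBuiltInFunction() and the surrounding resolution order are assumptions here, the point is only that built-in lookups are delegated to ModuleManager:

import java.util.Optional;
import org.apache.flink.table.functions.FunctionDefinition;

public class FunctionCatalog {
  private final ModuleManager moduleManager;

  public FunctionCatalog(ModuleManager moduleManager) {
    this.moduleManager = moduleManager;
  }

  // Built-in functions are resolved through the loaded modules; catalog and
  // temporary functions would be handled elsewhere in FunctionCatalog.
  public Optional<FunctionDefinition> lookupBuiltInFunction(String name) {
    return moduleManager.getFunctionDefinition(name);
  }
}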

...

As mentioned above, though this FLIP provides a generic design and mechanism for all module object types we want to support, we will only implement functions. Other objects can be added incrementally later on.

...

...