
...

  • Identify the nature of all the supported services
    • Which services are purely topology-based?
      • WEBHCAT
      • HBASE
      • OOZIE
      • WEBHDFS
      • ?

    • Which services are currently ZooKeeper-based?
      • HIVE (HS2ZooKeeperURLManager)
      • HBASE (HBaseZooKeeperURLManager)
      • Kafka (KafkaZooKeeperURLManager)
      • SOLR (SOLRZooKeeperURLManager)
      • ?

    • Could ZooKeeper support be added for any services which do not currently support it?

  • For the ZooKeeper-based HA services, determine if the ZooKeeper details are available from the service's configuration via Ambari (the second example following this list shows the kind of configuration properties involved).

  • Can "HA mode" be determined from the cluster configuration details? Can Knox dynamically identify HA-configured services, and generate the topology accordingly?

  • Determine how to leverage the cluster discovery data to generate the ZooKeeper HA configuration for the relevant declared topology services (the first example following this list shows the form such generated configuration could take).

  • Investigate moving cluster-specific service HA configuration from the HaProvider configuration to the service declaration (the final sketch following this list illustrates one possibility).
    • Service-specific details belong more logically with the service declaration than with a shared provider.
    • Moving cluster-specific details out of the provider configuration will make shared provider configurations applicable to more topologies across clusters (i.e., more reusable).
    • The cluster-specific configuration could be discovered along with the service URL details, rather than having to be hard-coded.
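
For reference, the ZooKeeper-based services listed above are configured today by adding the ZooKeeper connection details to the HaProvider parameters (e.g., for HIVE). The sketch below uses the proposed YAML format from later in this page; the host names, ports, and namespace are placeholders, and they are exactly the cluster-specific values that discovery could supply.

Code Block: ZooKeeper-based HA parameters (language: yaml)
# Minimal sketch: zookeeperEnsemble and zookeeperNamespace carry the cluster-specific
# ZooKeeper details, while the remaining settings are generic failover tuning.
- role: ha
  name: HaProvider
  enabled: true
  params:
    HIVE: maxFailoverAttempts=3;failoverSleep=1000;enabled=true;zookeeperEnsemble=zk-host-1:2181,zk-host-2:2181,zk-host-3:2181;zookeeperNamespace=hiveserver2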

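The Ambari discovery question and the "HA mode" question can be illustrated with the HiveServer2 case. Below is a hypothetical example of the hive-site values a discovery step could read through Ambari's configuration API; the property names are the usual HiveServer2 dynamic service discovery settings, and the values are placeholders.

Code Block: Illustrative hive-site values surfaced by Ambari (language: yaml)
# Hypothetical/illustrative only: the dynamic service discovery flag suggests "HA mode",
# and the quorum/namespace values map onto zookeeperEnsemble/zookeeperNamespace above.
hive-site:
  hive.server2.support.dynamic.service.discovery: "true"
  hive.zookeeper.quorum: zk-host-1:2181,zk-host-2:2181,zk-host-3:2181
  hive.server2.zookeeper.namespace: hiveserver2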

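Finally, the relocation proposed in the last investigation item above is sketched here. This is purely illustrative and not an existing Knox format: the structure and property names are hypothetical, and the cluster-specific values are the ones discovery would fill in.

Code Block: Hypothetical service-level HA declaration (language: yaml)
# Hypothetical sketch only; no such service-level HA syntax exists in Knox today.
services:
  - name: HIVE
    ha:
      enabled: true
      maxFailoverAttempts: 3
      failoverSleep: 1000
      zookeeperEnsemble: zk-host-1:2181,zk-host-2:2181,zk-host-3:2181   # discoverable
      zookeeperNamespace: hiveserver2                                   # discoverable

With the cluster-specific values attached to (or discovered for) the service declaration, the HaProvider entry itself would carry only generic failover settings, which is what makes a shared provider configuration reusable across clusters.
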
...

Proposed Alternative Provider Configuration File Formats

JSON

Code Block: JSON Provider Configuration (language: js)
{
  "providers": [
    {
      "role": "authentication",
      "name": "ShiroProvider",
      "enabled": "true",
      "params": {
        "sessionTimeout": "30",
        "main.ldapRealm": "org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm",
        "main.ldapContextFactory": "org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory",
        "main.ldapRealm.contextFactory": "$ldapContextFactory",
        "main.ldapRealm.userDnTemplate": "uid={0},ou=people,dc=hadoop,dc=apache,dc=org",
        "main.ldapRealm.contextFactory.url": "ldap://localhost:33389",
        "main.ldapRealm.contextFactory.authenticationMechanism": "simple",
        "urls./**": "authcBasic"
      }
    },
    {
      "role": "hostmap",
      "name": "static",
      "enabled": "true",
      "params": {
        "localhost": "sandbox,sandbox.hortonworks.com"
      }
    },
    {
      "role": "ha",
      "name": "HaProvider",
      "enabled": "true",
      "params": {
        "WEBHDFS": "maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true",
        "HIVE": "maxFailoverAttempts=3;failoverSleep=1000;enabled=true"
      }
    }
  ]
}


YAML

Code Block: YAML Provider Configuration (language: yaml)
---
providers:
  - role: authentication
    name: ShiroProvider
    enabled: true
    params:
      sessionTimeout: 30
      main.ldapRealm: org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm
      main.ldapContextFactory: org.apache.hadoop.gateway.shirorealm.KnoxLdapContextFactory
      main.ldapRealm.contextFactory: $ldapContextFactory
      main.ldapRealm.userDnTemplate: uid={0},ou=people,dc=hadoop,dc=apache,dc=org
      main.ldapRealm.contextFactory.url: ldap://localhost:33389
      main.ldapRealm.contextFactory.authenticationMechanism: simple
      urls./**: authcBasic
  - role: hostmap
    name: static
    enabled: true
    params:
      localhost: sandbox,sandbox.hortonworks.com
  - role: ha
    name: HaProvider
    enabled: true
    params:
      WEBHDFS: maxFailoverAttempts=3;failoverSleep=1000;maxRetryAttempts=300;retrySleep=1000;enabled=true
      HIVE: maxFailoverAttempts=3;failoverSleep=1000;enabled=true