I recently ran into an authentication bug while testing HiveServer2 on Hive 0.11. The symptom:
with the CUSTOM authentication mode specified in the configuration, connecting to HiveServer2 through beeline simply hangs.
Hive configuration:

hive.server2.authentication = CUSTOM
hive.server2.custom.authentication.class = com.vipshop.hive.service.AuthWithPasswd
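
With this configuration in place, a connection attempt such as the following hangs on the client side (host, port and credentials here are placeholders, not taken from the original setup):

beeline -u jdbc:hive2://localhost:10000 -n someuser -p somepassword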

Looking at the HiveServer2 log, the following error shows up:

15/01/08 17:54:59 ERROR server.TThreadPoolServer: Error occurred during processing of message.
java.lang.RuntimeException: java.lang.NoSuchMethodException: org.apache.hive.service.auth.PasswdAuthenticationProvider.<init>()
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
        at org.apache.hive.service.auth.CustomAuthenticationProviderImpl.<init>(CustomAuthenticationProviderImpl.java:52)
        at org.apache.hive.service.auth.AuthenticationProviderFactory.getAuthenticationProvider(AuthenticationProviderFactory.java:62)
        at org.apache.hive.service.auth.PlainSaslHelper$PlainServerCallbackHandler.handle(PlainSaslHelper.java:73)
        at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:102)
        at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:509)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:264)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:189)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.NoSuchMethodException: org.apache.hive.service.auth.PasswdAuthenticationProvider.<init>()
        at java.lang.Class.getConstructor0(Class.java:2706)
        at java.lang.Class.getDeclaredConstructor(Class.java:1985)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
        ... 12 more

After commenting out the hive.server2.authentication setting (i.e. falling back to the default NONE), the connection works again. This is in fact a bug in Hive 0.11.0 that was fixed in Hive 0.13.0; the JIRA issue is:

https://issues.apache.org/jira/browse/HIVE-4778

Let's walk through the classes involved:
The AuthTypes enum in org.apache.hive.service.auth.HiveAuthFactory defines the supported authentication methods (NONE, LDAP, KERBEROS, CUSTOM, PAM (not supported in Hive 0.11), etc.). Here we use CUSTOM, a user/password based authentication mode, which involves the PasswdAuthenticationProvider and CustomAuthenticationProviderImpl classes.
1) org.apache.hive.service.auth.PasswdAuthenticationProvider is an interface for user name/password based authentication. It declares a single method, Authenticate, whose parameters are the user name and the password; a sketch follows below.
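
For reference, the interface looks roughly like this (a minimal sketch reconstructed from the description above, not a verbatim copy of the Hive source; the exception type is assumed to be javax.security.sasl.AuthenticationException, which is what the factory code below throws):

import javax.security.sasl.AuthenticationException;

public interface PasswdAuthenticationProvider {
  // Authenticate the given user with the given password; throw on failure.
  void Authenticate(String user, String password) throws AuthenticationException;
}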
2) org.apache.hive.service.auth.AnonymousAuthenticationProviderImpl implements the PasswdAuthenticationProvider interface with an empty Authenticate method (it simply returns), as sketched below.
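
A sketch of what that no-op implementation amounts to (again reconstructed from the description, not copied from the source):

public class AnonymousAuthenticationProviderImpl implements PasswdAuthenticationProvider {
  @Override
  public void Authenticate(String user, String password) throws AuthenticationException {
    // No checks at all: any user/password combination is accepted.
    return;
  }
}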
3) org.apache.hive.service.auth.AuthenticationProviderFactory defines an AuthMethods enum listing the valid authentication methods (Hive 0.13: LDAP/PAM/CUSTOM/NONE; Hive 0.11: LDAP/CUSTOM/NONE) together with the corresponding wrappers, and exposes a getAuthenticationProvider method that returns the concrete implementation class for a given AuthMethods value:

  public static PasswdAuthenticationProvider getAuthenticationProvider(AuthMethods authMethod)
      throws AuthenticationException {
    if (authMethod.equals(AuthMethods.LDAP)) {
      return new LdapAuthenticationProviderImpl();
    } else if (authMethod.equals(AuthMethods.PAM)) {
      return new PamAuthenticationProviderImpl();
    } else if (authMethod.equals(AuthMethods.CUSTOM)) {
      return new CustomAuthenticationProviderImpl();
    } else if (authMethod.equals(AuthMethods.NONE)) {
      return new AnonymousAuthenticationProviderImpl();
    } else {
      throw new AuthenticationException("Unsupported authentication method");
    }
  }
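
For context, the AuthMethods enum mentioned above essentially maps each constant to its string name and validates the configured auth-method string; a rough sketch (not the literal Hive source) looks like this:

public enum AuthMethods {
  LDAP("LDAP"),
  PAM("PAM"),
  CUSTOM("CUSTOM"),
  NONE("NONE");

  private final String authMethod;

  AuthMethods(String authMethod) {
    this.authMethod = authMethod;
  }

  public String getAuthMethod() {
    return authMethod;
  }

  // Map a configured string such as "CUSTOM" back to the enum constant,
  // rejecting anything that is not a valid authentication method.
  public static AuthMethods getValidAuthMethod(String authMethodStr)
      throws AuthenticationException {
    for (AuthMethods auth : AuthMethods.values()) {
      if (authMethodStr.equals(auth.getAuthMethod())) {
        return auth;
      }
    }
    throw new AuthenticationException("Not a valid authentication method");
  }
}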

4) org.apache.hive.service.auth.CustomAuthenticationProviderImpl

This is the concrete implementation class of the PasswdAuthenticationProvider interface.
The implementation in Hive 0.13:

public class CustomAuthenticationProviderImpl implements PasswdAuthenticationProvider {

  Class<? extends PasswdAuthenticationProvider> customHandlerClass;
  PasswdAuthenticationProvider customProvider;

  @SuppressWarnings("unchecked")
  CustomAuthenticationProviderImpl() {
    HiveConf conf = new HiveConf();
    this.customHandlerClass = (Class<? extends PasswdAuthenticationProvider>) conf.getClass(
        HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.varname,  // i.e. "hive.server2.custom.authentication.class"
        PasswdAuthenticationProvider.class);
    // HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS("hive.server2.custom.authentication.class", null)
    this.customProvider = ReflectionUtils.newInstance(this.customHandlerClass, conf);  // instantiate the class via reflection
  }

  @Override
  public void Authenticate(String user, String password)  // the concrete Authenticate method that performs the actual check
      throws AuthenticationException {
    this.customProvider.Authenticate(user, password);
  }
}

The implementation in Hive 0.11:

public class CustomAuthenticationProviderImpl implements PasswdAuthenticationProvider {

  Class<? extends PasswdAuthenticationProvider> customHandlerClass;
  PasswdAuthenticationProvider customProvider;

  @SuppressWarnings("unchecked")
  CustomAuthenticationProviderImpl() {
    HiveConf conf = new HiveConf();
    this.customHandlerClass = (Class<? extends PasswdAuthenticationProvider>) conf.getClass(
        HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.name(),  // i.e. "HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS"
        PasswdAuthenticationProvider.class);
    this.customProvider = ReflectionUtils.newInstance(this.customHandlerClass, conf);
  }

  @Override
  public void Authenticate(String user, String password)
      throws AuthenticationException {
    this.customProvider.Authenticate(user, password);
  }
}

The only difference is how customHandlerClass is looked up. Let's verify this by hand with a small standalone test:

import java.lang.*;

public class HiveConf {
  public static enum ConfVars {
    PLAN_SERIALIZATION("hive.plan.serialization.format", "kryo"),
    HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS("hive.server2.custom.authentication.class", null),
    ;

    public final String varname;
    public final String defaultVal;

    ConfVars(String varname, String defaultVal) {
      this.varname = varname;
      this.defaultVal = defaultVal;
    }

    public String toString() {
      return varname;
    }
  }

  public static void main(String args[]) {
    System.out.println("ConfVars List:");
    for (ConfVars c : ConfVars.values()) {  // ConfVars.values() lists all enum constants
      System.out.println(c + " is: " + c);
      // hive.server2.custom.authentication.class is: hive.server2.custom.authentication.class
      System.out.println(c + " varname: " + c.varname);  // varname is the value the constant was defined with, while name() is the constant's own name
      // hive.server2.custom.authentication.class varname: hive.server2.custom.authentication.class
      System.out.println(c + " name(): " + c.name());
      // hive.server2.custom.authentication.class name(): HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS
    }
    System.out.println(HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.name());     // HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS
    System.out.println(HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.varname);    // hive.server2.custom.authentication.class
  }
}

You can see that when HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.varname is used, this.customHandlerClass resolves to the concrete implementation class (i.e. the class configured via hive.server2.custom.authentication.class), for example class com.vipshop.hive.service.AuthWithPasswd. When HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.name() is used instead, name() returns the constant name "HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS", which is not a configuration key, so conf.getClass falls back to its default and returns the interface, i.e. interface org.apache.hive.service.auth.PasswdAuthenticationProvider.
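
As an aside, a custom provider such as the configured com.vipshop.hive.service.AuthWithPasswd only needs a no-arg constructor and an Authenticate implementation. The real class is not shown in this post, so the following is a purely hypothetical sketch of its shape:

package com.vipshop.hive.service;

import javax.security.sasl.AuthenticationException;
import org.apache.hive.service.auth.PasswdAuthenticationProvider;

// Hypothetical custom authenticator; the actual AuthWithPasswd used in the
// test environment is not shown in this post.
public class AuthWithPasswd implements PasswdAuthenticationProvider {

  @Override
  public void Authenticate(String user, String password) throws AuthenticationException {
    // Replace with a real check (LDAP lookup, database query, etc.).
    if (user == null || !"secret".equals(password)) {
      throw new AuthenticationException("Authentication failed for user " + user);
    }
  }
}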

Now look at the ReflectionUtils.newInstance method:

private static final Class<?>[] EMPTY_ARRAY = new Class[]{};
....
  public static <T> T newInstance(Class<T> theClass, Configuration conf) {
    T result;
    try {
      Constructor<T> meth = (Constructor<T>) CONSTRUCTOR_CACHE.get(theClass);
      if (meth == null) {
        meth = theClass.getDeclaredConstructor(EMPTY_ARRAY);  // look up the class's no-arg constructor
        meth.setAccessible(true);
        CONSTRUCTOR_CACHE.put(theClass, meth);
      }
      result = meth.newInstance();
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
    setConf(result, conf);
    return result;
  }

In Hive 0.11 theClass here is org.apache.hive.service.auth.PasswdAuthenticationProvider. Since it is an interface, it has no constructor at all, so getDeclaredConstructor throws NoSuchMethodException, which newInstance wraps into the RuntimeException we saw in the log.
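
A tiny standalone demo of this behavior (the interface and class names here are made up for illustration):

import java.lang.reflect.Constructor;

public class NoCtorDemo {

  // Stand-in for org.apache.hive.service.auth.PasswdAuthenticationProvider.
  interface PasswdAuthenticationProvider {
    void Authenticate(String user, String password);
  }

  public static void main(String[] args) {
    try {
      Constructor<?> c =
          PasswdAuthenticationProvider.class.getDeclaredConstructor(new Class[]{});
      System.out.println("constructor: " + c);  // never reached: interfaces have no constructors
    } catch (NoSuchMethodException e) {
      // Same exception type that ReflectionUtils.newInstance wraps into a RuntimeException
      System.out.println("NoSuchMethodException: " + e);
    }
  }
}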

Next, when is the authentication configuration actually loaded? We can see the relevant call stack by deliberately specifying an invalid value:
Start HiveServer2 with debug logging to the console using the following command:

bin/hiveserver2 -hiveconf hive.root.logger=DEBUG,console start
15/01/08 17:56:55 INFO service.AbstractService: Service:ThriftBinaryCLIService is started.
15/01/08 17:56:55 INFO service.AbstractService: Service:HiveServer2 is started.
15/01/08 17:56:55 ERROR thrift.ThriftCLIService: Error:
javax.security.auth.login.LoginException: Unsupported authentication type CUSTEM
        at org.apache.hive.service.auth.HiveAuthFactory.getAuthTransFactory(HiveAuthFactory.java:148)
        at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:43)
        at java.lang.Thread.run(Thread.java:662)

When HiveServer2 starts up normally, it launches the ThriftBinaryCLIService service by default.

In the run method of the ThriftBinaryCLIService class:

  public void run() {
    try {
      hiveAuthFactory = new HiveAuthFactory();  // run() first creates a HiveAuthFactory
      TTransportFactory transportFactory = hiveAuthFactory.getAuthTransFactory();  // then asks it for the matching TTransportFactory
      TProcessorFactory processorFactory = hiveAuthFactory.getAuthProcFactory(this);
.....

The HiveAuthFactory constructor parses the Hive configuration and picks up the authentication settings for HiveServer2:

  public HiveAuthFactory() throws TTransportException {
    conf = new HiveConf();
    transportMode = conf.getVar(HiveConf.ConfVars.HIVE_SERVER2_TRANSPORT_MODE);
    // HIVE_SERVER2_TRANSPORT_MODE("hive.server2.transport.mode", "binary", new StringsValidator("binary", "http"))
    authTypeStr = conf.getVar(HiveConf.ConfVars.HIVE_SERVER2_AUTHENTICATION);
    // HIVE_SERVER2_AUTHENTICATION("hive.server2.authentication", "NONE",
    //     new StringsValidator("NOSASL", "NONE", "LDAP", "KERBEROS", "PAM", "CUSTOM"))
    // here it comes back as null when unset; valid values are "NOSASL", "NONE", "LDAP", "KERBEROS", "PAM", "CUSTOM"

    // In http mode we use NOSASL as the default auth type
    if (transportMode.equalsIgnoreCase("http")) {
      if (authTypeStr == null) {
        authTypeStr = AuthTypes.NOSASL.getAuthName();
      }
    } else {
      if (authTypeStr == null) {
        authTypeStr = AuthTypes.NONE.getAuthName();
      }
      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())
          && ShimLoader.getHadoopShims().isSecureShimImpl()) {
        saslServer = ShimLoader.getHadoopThriftAuthBridge().createServer(
            conf.getVar(ConfVars.HIVE_SERVER2_KERBEROS_KEYTAB),
            conf.getVar(ConfVars.HIVE_SERVER2_KERBEROS_PRINCIPAL));
        // start delegation token manager
        try {
          saslServer.startDelegationTokenSecretManager(conf, null);
        } catch (IOException e) {
          throw new TTransportException("Failed to start token manager", e);
        }
      }
    }
  }

The getAuthTransFactory method then checks whether authTypeStr is one of the valid values; otherwise it throws an exception and startup is aborted:

  public TTransportFactory getAuthTransFactory() throws LoginException {
    TTransportFactory transportFactory;
    if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
      try {
        transportFactory = saslServer.createTransportFactory(getSaslProperties());
      } catch (TTransportException e) {
        throw new LoginException(e.getMessage());
      }
    } else if (authTypeStr.equalsIgnoreCase(AuthTypes.NONE.getAuthName())) {
      transportFactory = PlainSaslHelper.getPlainTransportFactory(authTypeStr);
    } else if (authTypeStr.equalsIgnoreCase(AuthTypes.LDAP.getAuthName())) {
      transportFactory = PlainSaslHelper.getPlainTransportFactory(authTypeStr);
    } else if (authTypeStr.equalsIgnoreCase(AuthTypes.PAM.getAuthName())) {
      transportFactory = PlainSaslHelper.getPlainTransportFactory(authTypeStr);
    } else if (authTypeStr.equalsIgnoreCase(AuthTypes.NOSASL.getAuthName())) {
      transportFactory = new TTransportFactory();
    } else if (authTypeStr.equalsIgnoreCase(AuthTypes.CUSTOM.getAuthName())) {
      transportFactory = PlainSaslHelper.getPlainTransportFactory(authTypeStr);
    } else {
      throw new LoginException("Unsupported authentication type " + authTypeStr);
    }
    return transportFactory;
  }
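
To sum up: the configured auth type is only validated at startup in getAuthTransFactory, while the custom provider class is instantiated lazily when a client actually authenticates (as the first stack trace shows). That is why HiveServer2 starts fine and the failure only surfaces as a hung beeline connection. The essence of the HIVE-4778 fix is the one-line change already visible when comparing the two versions of CustomAuthenticationProviderImpl above (shown here as a sketch, not the literal patch):

// Hive 0.11 (broken): looks up the enum constant name, which is not a configuration key
conf.getClass(HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.name(),
    PasswdAuthenticationProvider.class);

// Hive 0.13 (fixed by HIVE-4778): looks up the actual property name
conf.getClass(HiveConf.ConfVars.HIVE_SERVER2_CUSTOM_AUTHENTICATION_CLASS.varname,
    PasswdAuthenticationProvider.class);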