
Unable to start Hive CLI on Hadoop (MapR)

I am trying to access the Hive CLI, but it fails to start with the AccessControl problem below. Oddly enough, I am able to query Hive data from Hue without any AccessControl problem; it is only the Hive CLI that does not work. I am on a MapR cluster.

Any help is much appreciated.

[<user_name>@<edge_node> ~]$ hive 
SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/opt/mapr/hive/hive-2.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 
Logging initialized using configuration in file:/opt/mapr/hive/hive-2.1/conf/hive-log4j2.properties Async: true 
2017-09-23 23:52:08,988 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-api-jdo-4.2.4.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-api-jdo-4.2.1.jar." 
2017-09-23 23:52:08,993 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-core-4.1.6.jar." 
2017-09-23 23:52:09,004 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-rdbms-4.1.19.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-rdbms-4.1.7.jar." 
2017-09-23 23:52:09,038 INFO [main] DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored 
2017-09-23 23:52:09,039 INFO [main] DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored 
2017-09-23 23:52:14,2251 ERROR JniCommon fs/client/fileclient/cc/jni_MapRClient.cc:2172 Thread: 20235 mkdirs failed for /user/<user_name>, error 13 
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name> 
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:617) 
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531) 
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714) 
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:646) 
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
Caused by: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name> 
at com.mapr.fs.MapRFileSystem.makeDir(MapRFileSystem.java:1256) 
at com.mapr.fs.MapRFileSystem.mkdirs(MapRFileSystem.java:1276) 
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1913) 
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:823) 
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:917) 
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:616) 
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:256) 
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.beginOpen(TezSessionState.java:220) 
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:614) 
... 10 more 

Answer


The error says that you have been denied access to create a directory in the filesystem. This is likely /user/<user_name>, which will need to be created for you by the HDFS / MapR FS superuser.
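For reference, a minimal sketch of what the superuser would typically run to create and hand over that home directory; the /user/<user_name> path is taken from the error above, and the group name <user_name> is an assumption, so adjust both to your cluster:

# run as the MapR FS / HDFS superuser (e.g. the mapr user)
hadoop fs -mkdir -p /user/<user_name>
hadoop fs -chown <user_name>:<user_name> /user/<user_name>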

"I am able to query Hive data from Hue without the AccessControl problem"

Hue communicates over Thrift with HiveServer2.

The Hive CLI bypasses HiveServer2 and is deprecated.

You should use Beeline instead.

beeline -n $(whoami) -u jdbc:hive2://hiveserver:10000/default 

And if you are on a kerberized cluster, you will need additional options.
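For example, a hedged sketch of a kerberized connection; the realm EXAMPLE.COM and the HiveServer2 principal hive/_HOST@EXAMPLE.COM are assumptions, so substitute your cluster's actual values:

# obtain a Kerberos ticket first (realm is assumed)
kinit <user_name>@EXAMPLE.COM
# connect with the HiveServer2 service principal in the JDBC URL (principal is assumed)
beeline -u "jdbc:hive2://hiveserver:10000/default;principal=hive/_HOST@EXAMPLE.COM"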