2016-09-09

I am using code on my local machine to connect to a remote Hive through Spark SQL, but it fails with "Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient". This is my code:

package src.main.scala

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import com.datastax.spark.connector._
import com.datastax.spark.connector.cql.CassandraConnector

object hive_Test {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setMaster("spark://hadoop-s1:7077")
      .setAppName("kof-spark-hive")
    System.setProperty("hive.metastore.uris", "thrift://hadoop-s4:9083")
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("show databases").collect().foreach(println)
    sc.stop()
  }
}

But it fails and throws the following exceptions:

com.intellij.rt.execution.application.AppMain src.main.scala.hive_Test 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
16/09/09 10:36:31 INFO SparkContext: Running Spark version 1.6.0 
16/09/09 10:36:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/09/09 10:36:33 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path 
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:381) 
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:396) 
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:389) 
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) 
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:130) 
    at org.apache.hadoop.security.Groups.<init>(Groups.java:94) 
    at org.apache.hadoop.security.Groups.<init>(Groups.java:74) 
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:303) 
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283) 
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260) 
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790) 
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760) 
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633) 
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2163) 
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2163) 
    at scala.Option.getOrElse(Option.scala:120) 
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2163) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:323) 
    at src.main.scala.hive_Test$.main(hive_Test.scala:22) 
    at src.main.scala.hive_Test.main(hive_Test.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
16/09/09 10:36:33 INFO SecurityManager: Changing view acls to: huaicui 
16/09/09 10:36:33 INFO SecurityManager: Changing modify acls to: huaicui 
16/09/09 10:36:33 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(huaicui); users with modify permissions: Set(huaicui) 
16/09/09 10:36:33 INFO Utils: Successfully started service 'sparkDriver' on port 54961. 
16/09/09 10:36:34 INFO Slf4jLogger: Slf4jLogger started 
16/09/09 10:36:34 INFO Remoting: Starting remoting 
16/09/09 10:36:34 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:54974] 
16/09/09 10:36:34 INFO Remoting: Remoting now listens on addresses: [akka.tcp://[email protected]:54974] 
16/09/09 10:36:34 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 54974. 
16/09/09 10:36:34 INFO SparkEnv: Registering MapOutputTracker 
16/09/09 10:36:34 INFO SparkEnv: Registering BlockManagerMaster 
16/09/09 10:36:34 INFO DiskBlockManager: Created local directory at C:\Users\huaicui\AppData\Local\Temp\blockmgr-cf28f2a3-bae4-4566-8c06-492a85ca4554 
16/09/09 10:36:34 INFO MemoryStore: MemoryStore started with capacity 945.5 MB 
16/09/09 10:36:34 INFO SparkEnv: Registering OutputCommitCoordinator 
16/09/09 10:36:35 INFO Utils: Successfully started service 'SparkUI' on port 4040. 
16/09/09 10:36:35 INFO SparkUI: Started SparkUI at http://10.140.200.141:4040 
16/09/09 10:36:35 INFO AppClient$ClientEndpoint: Connecting to master spark://hadoop-s1:7077... 
16/09/09 10:36:35 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160908223603-0005 
16/09/09 10:36:35 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54994. 
16/09/09 10:36:35 INFO NettyBlockTransferService: Server created on 54994 
16/09/09 10:36:35 INFO BlockManagerMaster: Trying to register BlockManager 
16/09/09 10:36:35 INFO BlockManagerMasterEndpoint: Registering block manager 10.140.200.141:54994 with 945.5 MB RAM, BlockManagerId(driver, 10.140.200.141, 54994) 
16/09/09 10:36:35 INFO BlockManagerMaster: Registered BlockManager 
16/09/09 10:36:35 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0 
16/09/09 10:36:36 INFO HiveContext: Initializing execution hive, version 1.1.0 
16/09/09 10:36:36 INFO ClientWrapper: Inspected Hadoop version: 2.6.0-cdh5.8.0 
16/09/09 10:36:36 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.8.0 
16/09/09 10:36:37 INFO metastore: Trying to connect to metastore with URI thrift://hadoop-s4:9083 
16/09/09 10:36:37 INFO metastore: Opened a connection to metastore, current connections: 1 
16/09/09 10:36:37 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path 
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:381) 
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:396) 
    at org.apache.hadoop.util.Shell.getGroupsForUserCommand(Shell.java:147) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.createGroupExecutor(ShellBasedUnixGroupsMapping.java:100) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:125) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:72) 
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:239) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:220) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:208) 
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) 
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) 
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) 
    at com.google.common.cache.LocalCache.get(LocalCache.java:4000) 
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004) 
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874) 
    at org.apache.hadoop.security.Groups.getGroups(Groups.java:182) 
    at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1553) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:440) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) 
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1501) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82) 
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3024) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3043) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3268) 
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:215) 
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201) 
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:312) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:273) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:248) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513) 
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194) 
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238) 
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:220) 
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:210) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:464) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:463) 
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40) 
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) 
    at src.main.scala.hive_Test$.main(hive_Test.scala:23) 
    at src.main.scala.hive_Test.main(hive_Test.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
16/09/09 10:36:37 WARN Hive: Failed to register all functions. 
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1503) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82) 
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3024) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3043) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3268) 
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:215) 
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201) 
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:312) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:273) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:248) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513) 
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194) 
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238) 
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:220) 
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:210) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:464) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:463) 
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40) 
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) 
    at src.main.scala.hive_Test$.main(hive_Test.scala:23) 
    at src.main.scala.hive_Test.main(hive_Test.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1501) 
    ... 28 more 
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.NullPointerException 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263) 
    at com.google.common.cache.LocalCache.get(LocalCache.java:4000) 
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004) 
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874) 
    at org.apache.hadoop.security.Groups.getGroups(Groups.java:182) 
    at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1553) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:440) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) 
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 
    ... 33 more 
Caused by: java.lang.NullPointerException 
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010) 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:522) 
    at org.apache.hadoop.util.Shell.run(Shell.java:481) 
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:763) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:129) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:72) 
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:239) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:220) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:208) 
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) 
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) 
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) 
    ... 41 more 
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:540) 
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194) 
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238) 
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:220) 
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:210) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:464) 
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:463) 
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40) 
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) 
    at src.main.scala.hive_Test$.main(hive_Test.scala:23) 
    at src.main.scala.hive_Test.main(hive_Test.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147) 
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:206) 
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:312) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:273) 
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:248) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513) 
    ... 17 more 
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1503) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82) 
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3024) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3043) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3268) 
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:215) 
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201) 
    ... 21 more 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1501) 
    ... 28 more 
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.NullPointerException 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263) 
    at com.google.common.cache.LocalCache.get(LocalCache.java:4000) 
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004) 
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874) 
    at org.apache.hadoop.security.Groups.getGroups(Groups.java:182) 
    at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1553) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:440) 
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) 
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 
    ... 33 more 
Caused by: java.lang.NullPointerException 
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010) 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:522) 
    at org.apache.hadoop.util.Shell.run(Shell.java:481) 
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:763) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:129) 
    at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:72) 
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.getGroups(JniBasedUnixGroupsMappingWithFallback.java:51) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupList(Groups.java:239) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:220) 
    at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:208) 
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) 
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) 
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) 
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) 
    ... 41 more 
16/09/09 10:36:37 INFO SparkContext: Invoking stop() from shutdown hook 
16/09/09 10:36:37 INFO SparkUI: Stopped Spark web UI at http://10.140.200.141:4040 
16/09/09 10:36:37 INFO SparkDeploySchedulerBackend: Shutting down all executors 
16/09/09 10:36:37 INFO SparkDeploySchedulerBackend: Asking each executor to shut down 
16/09/09 10:36:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 
16/09/09 10:36:37 INFO MemoryStore: MemoryStore cleared 
16/09/09 10:36:37 INFO BlockManager: BlockManager stopped 
16/09/09 10:36:37 INFO BlockManagerMaster: BlockManagerMaster stopped 
16/09/09 10:36:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
16/09/09 10:36:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon. 
16/09/09 10:36:37 INFO SparkContext: Successfully stopped SparkContext 
16/09/09 10:36:37 INFO ShutdownHookManager: Shutdown hook called 
16/09/09 10:36:37 INFO ShutdownHookManager: Deleting directory C:\Users\huaicui\AppData\Local\Temp\spark-904b401c-2283-4f95-a90f-0cc97dcae00e 
16/09/09 10:36:37 INFO ShutdownHookManager: Deleting directory C:\Users\huaicui\AppData\Local\Temp\spark-69a836bb-71f1-4db9-968b-5ff983ad5bea 
16/09/09 10:36:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports. 

Process finished with exit code 1 

I found that the exception is thrown at this line: `val sqlContext = new HiveContext(sc)`

When I try to connect to Hive from spark-shell, it works fine.

Please help me.

Answer

You are running on a Windows machine, and it is failing with the error below:

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. 

Download the winutils.exe file from https://github.com/steveloughran/winutils and place it inside a `bin` folder (for example `C:\hadoop\bin\winutils.exe`). Then set the HADOOP_HOME environment variable to point to the parent of that `bin` folder (in the example, `C:\hadoop`).
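If you cannot set a system-wide environment variable, Hadoop also honours the `hadoop.home.dir` JVM property, which can be set from code before the SparkContext is created. A minimal sketch, assuming the hypothetical path `C:\hadoop\bin\winutils.exe`:

```scala
// Hypothetical location -- adjust to wherever you placed bin\winutils.exe.
// This must run before any Hadoop class is loaded,
// i.e. before constructing the SparkContext.
val hadoopHome = "C:\\hadoop"
System.setProperty("hadoop.home.dir", hadoopHome)
println(System.getProperty("hadoop.home.dir"))
```

Hadoop's `Shell` class reads this property when resolving the winutils path, so it is equivalent to setting HADOOP_HOME, but scoped to this one JVM.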


I have tried that before, but it still doesn't work. – Kof


What error did you get after adding winutils.exe? The error "Could not locate executable null\bin\winutils.exe" appears because winutils.exe is missing. See http://teknosrc.com/spark-error-java-io-ioexception-could-not-locate-executable-null-bin-winutils-exe-hadoop-binaries/ – abaghel


Checked it; it works fine now. The problem was that I had not set the property: `System.setProperty("hive.metastore.uris", "thrift://hadoop-s4:9083")`. Thank you very much. – Kof
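As an aside, the more common way to supply the metastore URI is a `hive-site.xml` on the driver's classpath (e.g. under `src/main/resources`) rather than a system property set from code. A sketch, reusing the thrift URI from the question:

```xml
<!-- hive-site.xml: picked up from the classpath by HiveContext. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://hadoop-s4:9083</value>
  </property>
</configuration>
```

This keeps the cluster address out of the source code and matches how spark-shell (which worked here) typically finds the metastore.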
