
I am trying to access the Hive metastore and am using Spark SQL for that. I have set up a SparkSession, but when I run my program and look at the logs I see the exception below. Is Spark not reading hive-site.xml?

Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) 
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188) 
    ... 61 more 
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 
    ... 62 more 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 
    ... 68 more 
Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details. 

I am running a servlet that executes the following code:

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.spark.sql.SparkSession;

public class HiveReadone extends HttpServlet {
    private static final long serialVersionUID = 1L;

    public HiveReadone() {
        super();
    }

    /**
     * @see HttpServlet#doGet(HttpServletRequest request, HttpServletResponse response)
     */
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.getWriter().append("Served at: ").append(request.getContextPath());

        // Build a Hive-enabled SparkSession; the warehouse directory points at HDFS.
        SparkSession spark = SparkSession
                .builder()
                .appName("Java Spark SQL basic example")
                .enableHiveSupport()
                .config("spark.sql.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
                .config("mapred.input.dir.recursive", true)
                .config("hive.mapred.supports.subdirectories", true)
                .config("hive.vectorized.execution.enabled", true)
                .master("local")
                .getOrCreate();

        response.getWriter().println(spark);
    }
}
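
The jdbc:derby URL in the stack trace shows that Spark's Hive client fell back to the embedded Derby metastore, i.e. it never picked up the MySQL settings from hive-site.xml; when Spark runs embedded in a servlet container, spark/conf is usually not consulted. As a sketch (not the original code), the metastore URI can also be passed straight to the builder, reusing the values that appear elsewhere in this question:

// Sketch: set the metastore URI on the builder itself, so the session
// does not depend on hive-site.xml being found on the classpath.
SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark SQL basic example")
        .enableHiveSupport()
        .config("hive.metastore.uris", "thrift://saurab:9083")   // from hive-site.xml below
        .config("spark.sql.warehouse.dir", "hdfs://saurab:9000/user/hive/warehouse")
        .master("local")
        .getOrCreate();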

Nothing gets printed to the browser except the output of response.getWriter().append("Served at: ").append(request.getContextPath());, which is Served at: /hiveServ.

Please have a look at my conf/hive-site.xml:

<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://saurab:3306/metastore_db?createDatabaseIfNotExist=true</value>
  <description>metadata is stored in a MySQL server</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>MySQL JDBC driver class</description>
</property>
<property>
  <name>hive.aux.jars.path</name>
  <value>/home/saurab/hadoopec/hive/lib/hive-serde-2.1.1.jar</value>
</property>
<property>
  <name>spark.sql.warehouse.dir</name>
  <value>hdfs://saurab:9000/user/hive/warehouse</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <!-- Make sure that <value> points to the Hive metastore URI in your cluster -->
  <value>thrift://saurab:9083</value>
  <description>URI for client to contact metastore server</description>
</property>
<property>
  <name>hive.server2.thrift.port</name>
  <value>10001</value>
  <description>Port number of HiveServer2 Thrift interface.
    Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
  <description>user name for connecting to mysql server</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
  <description>password for connecting to mysql server</description>
</property>
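
One thing worth noting: once hive.metastore.uris is set, the javax.jdo.* JDBC properties above are read by the metastore service itself, not by clients, so that service has to be running (typically started with hive --service metastore). A minimal sketch for checking that the metastore at thrift://saurab:9083 is actually reachable, assuming the Hive 2.1 client jars on the classpath:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;

public class MetastorePing {
    public static void main(String[] args) throws Exception {
        HiveConf conf = new HiveConf();
        // Same URI as in hive-site.xml; the JDBC settings stay on the server side.
        conf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://saurab:9083");
        HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        System.out.println(client.getAllDatabases());   // e.g. [default]
        client.close();
    }
}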

From what I have read, if we configure hive.metastore.uris, Spark should connect to the Hive metastore, but in my case it does not and gives me the error above.
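
Since a fallback to Derby usually just means the client never found hive-site.xml, one quick check is to ask the webapp's class loader whether the file is visible; inside a servlet container it generally needs to sit in WEB-INF/classes (e.g. via src/main/resources), not only in hive/conf or spark/conf. A small sketch that could go at the top of doGet (the logging is my own, not from the question):

// Sketch: log where (or whether) the running webapp can see hive-site.xml.
java.net.URL url = Thread.currentThread().getContextClassLoader()
        .getResource("hive-site.xml");
System.out.println(url != null
        ? "hive-site.xml found at " + url
        : "hive-site.xml NOT on the classpath");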

Answers


Try configuring Spark with Hive: copy your hive-site.xml into the spark/conf directory.
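
In code form, that suggestion amounts to something like the following sketch, with hypothetical paths inferred from the question (adjust to your installation):

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyHiveSite {
    public static void main(String[] args) throws Exception {
        // Hypothetical locations based on paths mentioned in the question.
        Path src = Paths.get("/home/saurab/hadoopec/hive/conf/hive-site.xml");
        Path dst = Paths.get("/home/saurab/hadoopec/spark/conf/hive-site.xml");
        Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
    }
}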

Thanks for your input to copy my hive-site.xml into the spark/conf directory, but as I said, I have already created hive-site.xml inside spark/conf – Saurab

Is it the same hive-site.xml that is also in /hive/conf? –

Yes, the hive-site.xml in hive and in spark is the same – Saurab