I am running into a problem while connecting. How do I connect Spark with HBase?
scala> val results = sql("SELECT * FROM tablename");
results: org.apache.spark.sql.DataFrame = [hbid: string, matrix_col: string, matrix_value_col: double, country_col: string]
scala> results.show();
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions: Tue Jul 25 10:03:27 SGT 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68549: row 'nrd_app_spt:capacity_new,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=x01shdpeapp3a.sgp.dbs.com,60020,1500273862255, seqNum=0
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68549: row 'nrd_app_spt:capacity_new,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=x01shdpeapp3a.sgp.dbs.com,60020,1500273862255, seqNum=0
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to x01shdpeapp3a.sgp.dbs.com/10.92.139.145:60020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to x01shdpeapp3a.sgp.dbs.com/10.92.139.145:60020 is closing. Call id=9, waitTime=3
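For context, one common way to read an HBase table from Spark (independent of the SQL registration used above) is the HBase `TableInputFormat` via `newAPIHadoopRDD`. This is only a minimal sketch under the assumption that a valid `hbase-site.xml` is on the driver and executor classpath; the table name is taken from the error message above, and the standalone `object` wrapper is illustrative, not part of the original spark-shell session:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.sql.SparkSession

object HBaseReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbase-read").getOrCreate()

    // HBaseConfiguration.create() picks up hbase-site.xml from the classpath,
    // so the ZooKeeper quorum and security settings come from there.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "nrd_app_spt:capacity_new")

    // Each record comes back as a (rowKey, Result) pair.
    val rdd = spark.sparkContext.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    // Print the first few row keys as a connectivity check.
    rdd.take(5).foreach { case (key, _) =>
      println(Bytes.toString(key.get()))
    }
    spark.stop()
  }
}
```

If this scan hits the same `RetriesExhaustedException`, the problem is in the client's HBase configuration or network path rather than in Spark SQL itself.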
I want to read a table in HBase and apply some built-in functions. The problem was resolved by changing the following property in hbase-site.xml:
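The answer does not show the actual property, so it is left unspecified here. Purely as a hedged illustration: a `ConnectionClosingException` ("Connection ... is closing") against a secured HBase cluster is often caused by an `hbase.rpc.protection` mismatch between client and server, in which case the client-side override in hbase-site.xml could look like the fragment below. The value shown is an assumption and must match whatever the server side is configured with; this is not necessarily the fix the answerer applied:

```xml
<property>
  <name>hbase.rpc.protection</name>
  <!-- must match the server-side value: authentication, integrity, or privacy -->
  <value>privacy</value>
</property>
```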
The change resolved it. Is Scala enough for writing the queries? –