2017-09-14 83 views
1

dsetool status datastax - cannot connect to DSE resource manager on spark-submit

DC: dc1  Workload: Cassandra  Graph: no 
====================================================== 
Status=Up/Down 
|/ State=Normal/Leaving/Joining/Moving 
-- Address   Load    Owns     VNodes            Rack   Health [0,1] 
UN 192.168.1.130  810.47 MiB  ?     256            2a   0.90 
UN 192.168.1.131  683.53 MiB  ?     256           2a   0.90 
UN 192.168.1.132  821.33 MiB  ?     256           2a   0.90 

DC: dc2  Workload: Analytics  Graph: no  Analytics Master: 192.168.2.131 
=========================================================================================
Status=Up/Down 
|/ State=Normal/Leaving/Joining/Moving 
-- Address   Load    Owns     VNodes           Rack   Health [0,1] 
UN 192.168.2.130  667.05 MiB  ?     256           2a   0.90 
UN 192.168.2.131  845.48 MiB  ?     256           2a   0.90 
UN 192.168.2.132  887.92 MiB  ?     256           2a   0.90 

When I try to launch a spark-submit job

dse -u user -p password spark-submit --class com.sparkLauncher test.jar prf 

I get the following errors (edited):

ERROR 2017-09-14 20:14:14,174 org.apache.spark.deploy.rm.DseAppClient$ClientEndpoint: Failed to connect to DSE resource manager 
java.io.IOException: Failed to register with master: dse://? 

....

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: The method DseResourceManager.registerApplication does not exist. Make sure that the required component for that method is active/enabled 

....

ERROR 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application has been killed. Reason: Failed to connect to DSE resource manager: Failed to register with master: dse://? 
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Failed to connect to DSE resource manager: Failed to register with master: dse://? 

....

WARN 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application ID is not initialized yet. 
ERROR 2017-09-14 20:14:14,384 org.apache.spark.SparkContext: Error initializing SparkContext. 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 

ERROR 2017-09-14 20:14:14,387 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application 
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem 

I can confirm that I have granted the permissions mentioned in this documentation: https://docs.datastax.com/en/dse/5.1/dse-admin/datastax_enterprise/security/secAuthSpark.html I am trying this on AWS, if that makes a difference, and I can confirm that routing between the nodes is wide open. I can launch a spark shell from any of the Spark nodes, can bring up the Spark UI, and can get the Spark master from cqlsh commands.

Any pointers would be helpful, thanks in advance!

Answers

0

For some reason I cannot pinpoint, I can run it in cluster mode but not in client mode.
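As a sketch of what that workaround looks like: the command below is the submission from the question with cluster mode forced, so the driver is launched inside the cluster rather than on the submitting machine. `--deploy-mode` is a standard `spark-submit` flag; the credentials, class, and jar name are the ones from the question, not verified values.

```shell
# Same submission as in the question, but in cluster mode:
# the driver runs on a cluster node instead of the client machine.
dse -u user -p password spark-submit \
  --deploy-mode cluster \
  --class com.sparkLauncher test.jar prf
```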

1

The master address must point to one or more nodes in a valid datacenter with Analytics enabled.

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: 
The method DseResourceManager.registerApplication does not exist. 
Make sure that the required component for that method is active/enabled 

means that the node you connected to does not have Analytics enabled.

If you are running from a non-Analytics node, you must still point the master URI at one of the Analytics nodes.

dse://[Spark node address[:port number]]?[parameter name=parameter value;]... 

By default, the dse://? URL connects to localhost for its initial cluster connection.
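A minimal sketch of such an explicit master URL, assuming the cluster from the question: 192.168.2.131 is the Analytics Master shown in the dsetool output above, and the port is omitted so the default applies. `--master` is a standard `spark-submit` flag; the dse:// URL follows the format shown above.

```shell
# Point the master at a node in the Analytics DC (dc2) instead of
# relying on the default dse://? connection to localhost.
dse -u user -p password spark-submit \
  --master dse://192.168.2.131 \
  --class com.sparkLauncher test.jar prf
```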

See the documentation for more information.

+0

@DataStax! I am running spark-submit from the master node, is that what you are referring to? – avinash

+0

You can run it from any Analytics-enabled node. If you are still getting that message, it means the Analytics module is not running. I would check the system logs. – RussS
