2017-07-28

I installed DC/OS version 1.9.2 on OpenStack and tried to install Apache Spark on it, but Spark fails to install on DC/OS.

dcos package install spark 
Installing Marathon app for package [spark] version [1.1.0-2.1.1] 
Installing CLI subcommand for package [spark] version [1.1.0-2.1.1] 
New command available: dcos spark 
DC/OS Spark is being installed! 

However, the DC/OS dashboard shows Spark stuck in "Deploying" and the task never runs. The error log shows this message:

I0728 16:43:36.348244 14038 exec.cpp:162] Version: 1.2.2 
I0728 16:43:36.656839 14046 exec.cpp:237] Executor registered on agent abf187f4-ad7d-4ead-9437-5cdba4f77bdc-S1 
+ export DISPATCHER_PORT=24238 
+ DISPATCHER_PORT=24238 
+ export DISPATCHER_UI_PORT=24239 
+ DISPATCHER_UI_PORT=24239 
+ export SPARK_PROXY_PORT=24240 
+ SPARK_PROXY_PORT=24240 
+ SCHEME=http 
+ OTHER_SCHEME=https 
+ [[ '' == true ]] 
+ export DISPATCHER_UI_WEB_PROXY_BASE=/service/spark 
+ DISPATCHER_UI_WEB_PROXY_BASE=/service/spark 
+ grep -v '#https#' /etc/nginx/conf.d/spark.conf.template 
+ sed s,#http#,, 
+ sed -i 's,<PORT>,24240,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<DISPATCHER_URL>,http://172.16.129.180:24238,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<DISPATCHER_UI_URL>,http://172.16.129.180:24239,' /etc/nginx/conf.d/spark.conf 
+ sed -i 's,<PROTOCOL>,,' /etc/nginx/conf.d/spark.conf 
+ [[ '' == true ]] 
+ [[ -f hdfs-site.xml ]] 
+ [[ -n '' ]] 
+ exec runsvdir -P /etc/service 
+ mkdir -p /mnt/mesos/sandbox/nginx 
+ mkdir -p /mnt/mesos/sandbox/spark 
+ exec svlogd /mnt/mesos/sandbox/nginx 
+ exec svlogd /mnt/mesos/sandbox/spark 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32 

How can I get the Spark task running on DC/OS? Thanks.
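For reference, once the dispatcher is healthy, jobs are submitted through the `dcos spark` CLI installed above. The class name and jar URL below are placeholders, not from this post:

```shell
# Submit a job through the Spark dispatcher once it is running.
# <JAR_URL> is a placeholder for a jar reachable from the cluster.
dcos spark run --submit-args='--class org.apache.spark.examples.SparkPi <JAR_URL> 30'
```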


I logged into the Spark Docker container and tried changing types_hash_bucket_size to 64. The deployment then completed, but the process is still not healthy. –
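The workaround tried in this comment can be sketched as follows. The config path comes from the startup log above; the directive values (2048/64) are assumptions, and the sketch operates on a scratch copy — inside the container the target would be /etc/nginx/conf.d/spark.conf, followed by `nginx -t && nginx -s reload`:

```shell
# Sketch of the comment's workaround: raise nginx's hash sizes so the
# types_hash table can be built. Values are assumptions, not verified.
# In the container: conf=/etc/nginx/conf.d/spark.conf, then reload nginx.
conf=$(mktemp)
printf 'types_hash_max_size 2048;\ntypes_hash_bucket_size 64;\n' >> "$conf"
cat "$conf"
```

Note that any edit made inside the running container is lost when DC/OS restarts the task, which matches the comment's observation that the fix did not stick.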


I checked the DC/OS logs. My guess: the deployment times out because NGINX inside the Docker container cannot start. DC/OS runs a health check against Spark, but NGINX does not respond, so DC/OS marks Spark as unhealthy and kills it. Spark then tries to deploy again, and the cycle never ends. –

Answer


Check whether you uninstalled the previous Spark installation correctly. You must delete the old Spark entries from ZooKeeper (via Exhibitor).
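A hedged sketch of that cleanup, assuming the package defaults for the role, principal, and ZooKeeper node names (all three values below are assumptions and may differ in your cluster):

```shell
# Uninstall the package first.
dcos package uninstall spark

# DC/OS 1.9 ships a janitor image for clearing leftover cluster state;
# -r/-p/-z below are assumed package defaults, not taken from this post.
dcos node ssh --master-proxy --leader \
  "docker run mesosphere/janitor /janitor.py \
     -r spark-role -p spark-principal -z spark_mesos_dispatcher"
```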

Also check that no zombie frameworks are holding on to resources and blocking the new deployment. Kill them:

curl -X POST http://MESOSMASTER_URL:5050/master/teardown -d 'frameworkId=<FRAMEWORKID>' 
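To find the framework id, the Mesos master's state endpoint can be queried first. A sketch, reusing the MESOSMASTER_URL placeholder from above:

```shell
# List registered frameworks with their ids; inactive Spark entries are
# the zombies to tear down with the command above.
curl -s http://MESOSMASTER_URL:5050/master/state \
  | python -c 'import json, sys
for f in json.load(sys.stdin)["frameworks"]:
    print(f["id"], f["name"], f["active"])'
```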

I tried uninstalling Spark and deleted the old ZooKeeper Spark entries. I also checked for zombie frameworks but could not find any. –