2017-08-07 122 views

Every time I run pyspark I get these errors. If I ignore them and just type sc, it gives NameError: name 'sc' is not defined. Any help? Pyspark shows me this on every startup:

pyspark 
Python 2.7.12 (default, Nov 19 2016, 06:48:10) 
[GCC 5.4.0 20160609] on linux2 
Type "help", "copyright", "credits" or "license" for more information. 
17/08/07 13:57:59 WARN NativeCodeLoader: Unable to load native-hadoop  library for your platform... using builtin-java classes where applicable 
Traceback (most recent call last): 
  File "/usr/local/spark/python/pyspark/shell.py", line 45, in <module> 
    spark = SparkSession.builder\ 
  File "/usr/local/spark/python/pyspark/sql/session.py", line 169, in getOrCreate 
    sc = SparkContext.getOrCreate(sparkConf) 
  File "/usr/local/spark/python/pyspark/context.py", line 334, in getOrCreate 
    SparkContext(conf=conf or SparkConf()) 
  File "/usr/local/spark/python/pyspark/context.py", line 118, in __init__ 
    conf, jsc, profiler_cls) 
  File "/usr/local/spark/python/pyspark/context.py", line 186, in _do_init 
    self._accumulatorServer = accumulators._start_update_server() 
  File "/usr/local/spark/python/pyspark/accumulators.py", line 259, in _start_update_server 
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler) 
  File "/usr/lib/python2.7/SocketServer.py", line 417, in __init__ 
    self.server_bind() 
  File "/usr/lib/python2.7/SocketServer.py", line 431, in server_bind 
    self.socket.bind(self.server_address) 
  File "/usr/lib/python2.7/socket.py", line 228, in meth 
    return getattr(self._sock,name)(*args) 
socket.gaierror: [Errno -2] Name or service not known 
Which version of Spark are you using, and which distribution exactly? Thanks –

Thanks for your attention. After a week of searching I solved my problem: the solution was to add localhost to the /etc/hosts file! I did that and everything works fine – EngAhmed

Answer


After a week of searching I found the solution: simply append localhost to the /etc/hosts file, and then everything worked fine.
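For context, the traceback ends in PySpark's accumulator server trying to bind a socket to "localhost", which fails when the hostname does not resolve. A minimal sketch of that same check, which can be used to verify the /etc/hosts fix took effect (this is a plain diagnostic, not PySpark's actual code):

```python
import socket

# Try to bind a TCP socket to "localhost", as PySpark's accumulator
# server does. If /etc/hosts has no entry mapping localhost to
# 127.0.0.1, this raises socket.gaierror: [Errno -2].
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.bind(("localhost", 0))  # port 0 = pick any free port
    print("localhost resolves; bound to port", s.getsockname()[1])
except socket.gaierror as e:
    print("localhost does not resolve:", e)
finally:
    s.close()
```

If the bind fails, adding a line such as `127.0.0.1   localhost` to /etc/hosts (the fix described above) should make both this snippet and the pyspark shell start cleanly.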