2016-04-22 297 views

Hi, PySpark - The system cannot find the path specified

I have run Spark (from the Spyder IDE) many times before. Today I got this error (the code is unchanged):

import os
import sys

from py4j.java_gateway import JavaGateway
gateway = JavaGateway()

os.environ['SPARK_HOME'] = "C:/Apache/spark-1.6.0"
os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_71"
sys.path.append("C:/Apache/spark-1.6.0/python/")
os.environ['HADOOP_HOME'] = "C:/Apache/spark-1.6.0/winutils/"

from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf()
    The system cannot find the path specified.
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\Apache\spark-1.6.0\python\pyspark\conf.py", line 104, in __init__
        SparkContext._ensure_initialized()
      File "C:\Apache\spark-1.6.0\python\pyspark\context.py", line 245, in _ensure_initialized
        SparkContext._gateway = gateway or launch_gateway()
      File "C:\Apache\spark-1.6.0\python\pyspark\java_gateway.py", line 94, in launch_gateway
        raise Exception("Java gateway process exited before sending the driver its port number")
    Exception: Java gateway process exited before sending the driver its port number

Where did it go wrong? Thanks for your time.


I think you should follow the answer mentioned at this link: http://stackoverflow.com/questions/30763951/spark-context-sc-not-defined/30851037#30851037 –


OK... someone had installed a new Java version in the VirtualMachine. I only changed os.environ['JAVA_HOME']="C:/Program Files/Java/jre1.8.0_91" and re-ran it. – Kardu

Answer


OK... someone had installed a new Java version in the VirtualMachine. I just changed this:

os.environ['JAVA_HOME']="C:/Program Files/Java/jre1.8.0_91" 

and it worked again. Thanks for your time.
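Since the cause here was a JAVA_HOME pointing at a Java directory that no longer existed, a quick sanity check before launching the SparkContext can catch this class of error early. A minimal sketch (the `check_env_paths` helper and the example paths are illustrative, not part of the original question; substitute your own installation directories):

```python
import os

def check_env_paths(env):
    """Return the names of path-valued variables whose directory is missing."""
    missing = []
    for name, path in env.items():
        if not os.path.isdir(path):
            missing.append(name)
    return missing

# Example paths taken from this question -- adjust to your machine.
env = {
    "SPARK_HOME": "C:/Apache/spark-1.6.0",
    "JAVA_HOME": "C:/Program Files/Java/jre1.8.0_91",
    "HADOOP_HOME": "C:/Apache/spark-1.6.0/winutils/",
}
bad = check_env_paths(env)
if bad:
    print("Fix these before creating a SparkContext:", bad)
else:
    os.environ.update(env)
```

Had this check run first, it would have reported `JAVA_HOME` as missing instead of the opaque "Java gateway process exited" exception.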
