
Answers

3

It looks like you are trying to call read on the SparkContext instead of on the SQLContext / SparkSession:

import org.apache.spark.SparkContext 
import org.apache.spark.sql.{SparkSession, SQLContext} 

// New 2.0.+ API: create a SparkSession and use it for all purposes: 
val session = SparkSession.builder().appName("test").master("local").getOrCreate() 
session.read.load("/file") // OK 

// Old <= 1.6.x API: create a SparkContext, then a SQLContext for the DataFrame API: 
val sc = new SparkContext("local", "test") // used for RDD operations only 
val sqlContext = new SQLContext(sc)        // used for DataFrame/Dataset APIs 

sqlContext.read.load("/file") // OK 
sc.read.load("/file")         // NOT OK: SparkContext has no read method 
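
If you are already on 2.0+ but some library code still expects a SQLContext, the SparkSession exposes both the underlying SparkContext and a SQLContext. A minimal sketch (the file path is just a placeholder): 

import org.apache.spark.sql.SparkSession 

val session = SparkSession.builder().appName("test").master("local").getOrCreate() 

val sc = session.sparkContext       // for RDD operations 
val sqlContext = session.sqlContext // for legacy code that still needs a SQLContext 

sqlContext.read.load("/file")       // same DataFrameReader as session.read 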
0

The full syntax for the SQLContext version is as follows:

val df = sqlContext 
  .read                                // read is a method without parentheses in Scala 
  .format("com.databricks.spark.csv")  // spark-csv package for Spark 1.x 
  .option("inferSchema", "true") 
  .option("header", "true") 
  .load("path to/data.csv") 

This applies when you are reading or writing a CSV file.
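
For reference, on Spark 2.0+ the external spark-csv package is no longer needed; the same read can be done with the built-in CSV source. A sketch, assuming the same placeholder path: 

import org.apache.spark.sql.SparkSession 

val spark = SparkSession.builder().appName("csv-example").master("local").getOrCreate() 

// Built-in CSV source replaces com.databricks.spark.csv in 2.0+ 
val df = spark.read 
  .option("header", "true") 
  .option("inferSchema", "true") 
  .csv("path to/data.csv") 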
