
Answers

10

You can cast the column to a date:

Scala:

import org.apache.spark.sql.types.DateType 

val newDF = df.withColumn("dateColumn", df("timestampColumn").cast(DateType)) 

PySpark:

df = df.withColumn('dateColumn', df['timestampColumn'].cast('date')) 
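For what it's worth, Column.cast in Scala also accepts the target type as a string, so the DateType import can be dropped; a small variant of the answer above:

// Variant: pass the target type name as a string instead of importing DateType
val newDF = df.withColumn("dateColumn", df("timestampColumn").cast("date"))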
+2

This is not Spark SQL. – dslack

+1

@dslack The solution uses functions provided as part of the Spark SQL package, but it doesn't use the SQL language. Instead of less reliable strings containing actual SQL queries, it uses the more robust DataFrame API and its SQL-like functions. –

+0

What is it about SQL queries that is less reliable? – dslack
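To illustrate the distinction being debated (a hedged sketch, not from either commenter; the view name events is hypothetical): column errors fail at analysis time either way, but structural mistakes against the DataFrame API are caught by the Scala compiler, while a malformed SQL string only fails when the query is analyzed at runtime.

// Hypothetical view name; assumes df has a timestamp column "ts"
df.createOrReplaceTempView("events")

// A typo inside the SQL string (e.g. "CAS(" or a bad column name) only
// surfaces as an AnalysisException when spark.sql analyzes the query:
val viaSql = spark.sql("SELECT CAST(ts AS DATE) AS date FROM events")

// A misspelled DataFrame method (e.g. withColum) would not even compile:
val viaApi = df.withColumn("date", df("ts").cast("date"))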

3

In Spark SQL:

SELECT CAST(the_ts AS DATE) AS the_date FROM the_table
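To run this query from code, the DataFrame first has to be registered as a temporary view; a minimal sketch, assuming the DataFrame is named df and reusing the table and column names from the query above:

// Register df under the name used in the query, then run it
df.createOrReplaceTempView("the_table")
val theDate = spark.sql("SELECT CAST(the_ts AS DATE) AS the_date FROM the_table")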

0

Imagine the following input:

import org.apache.spark.sql.functions.current_timestamp

val dataIn = spark.createDataFrame(Seq(
    (1, "some data"),
    (2, "more data")))
  .toDF("id", "stuff")
  .withColumn("ts", current_timestamp())

dataIn.printSchema 
root 
|-- id: integer (nullable = false) 
|-- stuff: string (nullable = true) 
|-- ts: timestamp (nullable = false) 

You can use the to_date function:

import org.apache.spark.sql.functions.to_date
import spark.implicits._ // for the $"colName" syntax

val dataOut = dataIn.withColumn("date", to_date($"ts"))

dataOut.printSchema 
root 
|-- id: integer (nullable = false) 
|-- stuff: string (nullable = true) 
|-- ts: timestamp (nullable = false) 
|-- date: date (nullable = false) 

dataOut.show(false) 
+---+---------+-----------------------+----------+
|id |stuff    |ts                     |date      |
+---+---------+-----------------------+----------+
|1  |some data|2017-11-21 16:37:15.828|2017-11-21|
|2  |more data|2017-11-21 16:37:15.828|2017-11-21|
+---+---------+-----------------------+----------+

I'd recommend preferring these methods over casting and plain SQL.
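As a closing note on to_date: if the source column is a string rather than a timestamp, Spark 2.2+ has an overload that takes an explicit pattern; a sketch with a hypothetical string column ts_str:

// Parse a hypothetical string column into a date with an explicit pattern (Spark 2.2+)
import org.apache.spark.sql.functions.to_date
val parsed = dataIn.withColumn("date", to_date($"ts_str", "yyyy-MM-dd HH:mm:ss"))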
