spear
Spark SQL function
@liancheng I want to ask a question about the Spark SQL function from_utc_timestamp(ts: Column, tz: String). I am using mongo-spark to load a "member" collection from MongoDB, which contains three fields: memberId, date, and timezone.

case class Member(memberId: String, date: Timestamp, timezone: String)
val memberDF: DataFrame = load[Member]("member")

I want to invoke from_utc_timestamp to get each member's timestamp in their own time zone, e.g. memberDF.select(memberId, from_utc_timestamp(date, timezone)). However, the tz parameter is a String, not a Column. How can from_utc_timestamp(ts: Column, tz: Column) be implemented?
def from_utc_timestamp(ts: Column, tz: String): Column = withExpr { FromUTCTimestamp(ts.expr, Literal(tz)) }
Unfortunately, withExpr is a private method, so I cannot reuse it to build the variant myself.
Thanks, Aaron
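(A sketch of one common workaround, not taken from this thread: bypass the typed Scala API and use expr(), whose SQL parser accepts a column reference for the time-zone argument. This assumes the memberDF described above and a Spark version where from_utc_timestamp is registered as a SQL function.)

```scala
import org.apache.spark.sql.functions.expr

// The SQL parser resolves both arguments as expressions, so the
// per-row "timezone" column can be passed where the Scala API
// signature only accepts a String literal.
val withLocalTs = memberDF.select(
  expr("memberId"),
  expr("from_utc_timestamp(date, timezone)").as("localTs")
)
```

This avoids calling the private withExpr helper entirely, since the conversion from SQL text to a Column happens inside expr().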
Hello! The timestamp is always stored in UTC+0; you need the member's location to deduce the time zone.
Thanks @Seidzi! I want to create a daily report keyed by each member's local date, so if from_utc_timestamp(ts: Column, tz: Column) were implemented, I could get the local date per time zone: to_date(from_utc_timestamp(current_timestamp, timezoneId))
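(The expr() workaround also covers this daily-report case; note too that later Spark releases added a from_utc_timestamp(ts: Column, tz: Column) overload directly, so upgrading may remove the need for it. A hedged sketch, assuming the memberDF from the original question:)

```scala
import org.apache.spark.sql.functions.{expr, to_date}

// Compute each member's local calendar date by converting the current
// UTC timestamp into the member's own time zone, then truncating to a date.
// The SQL-text form lets the "timezone" column vary per row.
val daily = memberDF
  .withColumn("localDate",
    to_date(expr("from_utc_timestamp(current_timestamp(), timezone)")))
  .groupBy("localDate")
  .count()
```

Grouping on localDate then yields one row per local calendar day, which is what a per-time-zone daily report needs.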