pyspark.sql.functions.make_timestamp_ntz

pyspark.sql.functions.make_timestamp_ntz(years, months, days, hours, mins, secs)
Create local date-time from years, months, days, hours, mins, secs fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs. Otherwise, it throws an error.

New in version 3.5.0.

Parameters
years : Column or str
    The year to represent, from 1 to 9999.
months : Column or str
    The month-of-year to represent, from 1 (January) to 12 (December).
days : Column or str
    The day-of-month to represent, from 1 to 31.
hours : Column or str
    The hour-of-day to represent, from 0 to 23.
mins : Column or str
    The minute-of-hour to represent, from 0 to 59.
secs : Column or str
    The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp.
 
Returns
Column
    A new column that contains a local date-time.
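The secs=60 rollover described above can be illustrated without Spark. This is a plain-Python sketch of the documented arithmetic using the stdlib datetime module; the helper name make_timestamp_ntz_local is hypothetical and is not part of the PySpark API.

```python
from datetime import datetime, timedelta

def make_timestamp_ntz_local(years, months, days, hours, mins, secs):
    """Sketch of the documented semantics using stdlib datetime (no Spark).

    A secs value of 60 sets the seconds field to 0 and adds one minute,
    as described in the parameter docs above.
    """
    whole = int(secs)
    micros = round((secs - whole) * 1_000_000)  # micro-fraction of secs
    if whole == 60:
        # Documented rollover: seconds -> 0, plus one minute.
        return datetime(years, months, days, hours, mins, 0, micros) + timedelta(minutes=1)
    return datetime(years, months, days, hours, mins, whole, micros)

print(make_timestamp_ntz_local(2014, 12, 28, 6, 30, 45.887))  # 2014-12-28 06:30:45.887000
print(make_timestamp_ntz_local(2014, 12, 28, 6, 30, 60))      # 2014-12-28 06:31:00
```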
 
Examples

Example 1: Make local date-time from years, months, days, hours, mins, secs.

>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
...     ["year", "month", "day", "hour", "min", "sec"])
>>> df.select(sf.make_timestamp_ntz(
...     df.year, df.month, df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
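The NULL-on-invalid-input behavior (when spark.sql.ansi.enabled is false) can also be sketched in plain Python, with None standing in for NULL. The helper name make_timestamp_ntz_lenient is hypothetical, not a PySpark function; it only illustrates the documented non-ANSI contract.

```python
from datetime import datetime
from typing import Optional

def make_timestamp_ntz_lenient(years, months, days, hours, mins, secs) -> Optional[datetime]:
    # Non-ANSI contract sketch: an invalid field value (e.g. month 13,
    # day 32) yields NULL (None here) instead of raising an error.
    try:
        return datetime(years, months, days, hours, mins, int(secs))
    except ValueError:
        return None

print(make_timestamp_ntz_lenient(2014, 13, 28, 6, 30, 45))  # None (month 13 is invalid)
print(make_timestamp_ntz_lenient(2014, 12, 28, 6, 30, 45))  # 2014-12-28 06:30:45
```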