pyspark.sql.functions.hour
- pyspark.sql.functions.hour(col)
Extract the hour of a given date/time/timestamp as an integer.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
Changed in version 4.1.0: Added support for time type.
- Parameters
  - col : Column or column name
    Target date/time/timestamp column to work on.
- Returns
  Column
    Hour part of the date/time/timestamp as an integer.
Examples
Example 1: Extract the hours from a string column representing timestamp
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.hour('ts')).show()
+-------------------+----------+--------+
|                 ts|typeof(ts)|hour(ts)|
+-------------------+----------+--------+
|2015-04-08 13:08:15|    string|      13|
|2024-10-31 10:09:16|    string|      10|
+-------------------+----------+--------+
Example 2: Extract the hours from a timestamp column
>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.datetime(2015, 4, 8, 13, 8, 15),),
...     (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.hour('ts')).show()
+-------------------+----------+--------+
|                 ts|typeof(ts)|hour(ts)|
+-------------------+----------+--------+
|2015-04-08 13:08:15| timestamp|      13|
|2024-10-31 10:09:16| timestamp|      10|
+-------------------+----------+--------+
Example 3: Extract the hours from a time column
>>> import datetime >>> from pyspark.sql import functions as sf >>> df = spark.createDataFrame([ ... ("13:08:15",), ... ("10:09:16",)], ['t']).withColumn("t", sf.col("t").cast("time")) >>> df.select("*", sf.typeof('t'), sf.hour('t')).show() +--------+---------+-------+ | t|typeof(t)|hour(t)| +--------+---------+-------+ |13:08:15| time(6)| 13| |10:09:16| time(6)| 10| +--------+---------+-------+