pyspark.sql.functions.crc32
- pyspark.sql.functions.crc32(col)
- Calculates the cyclic redundancy check value (CRC32) of a binary column and returns the value as a bigint.
- Changed in version 3.4.0: Supports Spark Connect.
- Parameters
- col : Column or column name
  - target column to compute on.
- Returns
- Column
  - the column for computed results.
- New in version 1.5.0.
- Examples

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('ABC',)], ['a'])
>>> df.select('*', sf.crc32('a')).show(truncate=False)
+---+----------+
|a  |crc32(a)  |
+---+----------+
|ABC|2743272264|
+---+----------+
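Spark computes the standard CRC-32 checksum (the same polynomial used by `java.util.zip.CRC32` and zlib), so the result for a string column can be cross-checked locally, without a Spark session, against Python's `zlib.crc32` applied to the UTF-8 bytes — a minimal sketch:

```python
import zlib

# CRC-32 of the UTF-8 bytes of 'ABC'. In Python 3, zlib.crc32 returns an
# unsigned 32-bit integer, matching the non-negative bigint Spark produces.
local_checksum = zlib.crc32('ABC'.encode('utf-8'))
print(local_checksum)  # 2743272264, the same value shown in the example above
```

This is handy for validating data round-trips: the checksum of a value written from Spark should equal the checksum recomputed by any other CRC-32 implementation on the same bytes.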